00:00:00.001 Started by upstream project "autotest-spdk-master-vs-dpdk-main" build number 4085
00:00:00.001 originally caused by:
00:00:00.001 Started by upstream project "nightly-trigger" build number 3675
00:00:00.001 originally caused by:
00:00:00.001 Started by timer
00:00:00.098 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.099 The recommended git tool is: git
00:00:00.099 using credential 00000000-0000-0000-0000-000000000002
00:00:00.100 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.155 Fetching changes from the remote Git repository
00:00:00.159 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.227 Using shallow fetch with depth 1
00:00:00.227 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.227 > git --version # timeout=10
00:00:00.289 > git --version # 'git version 2.39.2'
00:00:00.289 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.328 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.328 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:09.016 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:09.028 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:09.039 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD)
00:00:09.040 > git config core.sparsecheckout # timeout=10
00:00:09.051 > git read-tree -mu HEAD # timeout=10
00:00:09.066 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5
00:00:09.087 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag"
00:00:09.087 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10
00:00:09.167 [Pipeline] Start of Pipeline
00:00:09.180 [Pipeline] library
00:00:09.183 Loading library shm_lib@master
00:00:09.183 Library shm_lib@master is cached. Copying from home.
00:00:09.199 [Pipeline] node
00:00:09.215 Running on WFP49 in /var/jenkins/workspace/short-fuzz-phy-autotest
00:00:09.217 [Pipeline] {
00:00:09.228 [Pipeline] catchError
00:00:09.230 [Pipeline] {
00:00:09.244 [Pipeline] wrap
00:00:09.253 [Pipeline] {
00:00:09.261 [Pipeline] stage
00:00:09.263 [Pipeline] { (Prologue)
00:00:09.483 [Pipeline] sh
00:00:09.769 + logger -p user.info -t JENKINS-CI
00:00:09.790 [Pipeline] echo
00:00:09.793 Node: WFP49
00:00:09.803 [Pipeline] sh
00:00:10.110 [Pipeline] setCustomBuildProperty
00:00:10.125 [Pipeline] echo
00:00:10.128 Cleanup processes
00:00:10.133 [Pipeline] sh
00:00:10.416 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:10.416 477711 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:10.429 [Pipeline] sh
00:00:10.712 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:00:10.712 ++ grep -v 'sudo pgrep'
00:00:10.712 ++ awk '{print $1}'
00:00:10.712 + sudo kill -9
00:00:10.712 + true
00:00:10.727 [Pipeline] cleanWs
00:00:10.737 [WS-CLEANUP] Deleting project workspace...
00:00:10.737 [WS-CLEANUP] Deferred wipeout is used...
00:00:10.743 [WS-CLEANUP] done
00:00:10.749 [Pipeline] setCustomBuildProperty
00:00:10.768 [Pipeline] sh
00:00:11.050 + sudo git config --global --replace-all safe.directory '*'
00:00:11.152 [Pipeline] httpRequest
00:00:11.546 [Pipeline] echo
00:00:11.548 Sorcerer 10.211.164.20 is alive
00:00:11.557 [Pipeline] retry
00:00:11.560 [Pipeline] {
00:00:11.572 [Pipeline] httpRequest
00:00:11.576 HttpMethod: GET
00:00:11.577 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:11.577 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:11.600 Response Code: HTTP/1.1 200 OK
00:00:11.601 Success: Status code 200 is in the accepted range: 200,404
00:00:11.601 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:30.685 [Pipeline] }
00:00:30.702 [Pipeline] // retry
00:00:30.711 [Pipeline] sh
00:00:30.996 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz
00:00:31.012 [Pipeline] httpRequest
00:00:31.423 [Pipeline] echo
00:00:31.425 Sorcerer 10.211.164.20 is alive
00:00:31.437 [Pipeline] retry
00:00:31.439 [Pipeline] {
00:00:31.454 [Pipeline] httpRequest
00:00:31.460 HttpMethod: GET
00:00:31.460 URL: http://10.211.164.20/packages/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz
00:00:31.461 Sending request to url: http://10.211.164.20/packages/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz
00:00:31.481 Response Code: HTTP/1.1 200 OK
00:00:31.481 Success: Status code 200 is in the accepted range: 200,404
00:00:31.481 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz
00:01:41.772 [Pipeline] }
00:01:41.787 [Pipeline] // retry
00:01:41.794 [Pipeline] sh
00:01:42.080 + tar --no-same-owner -xf spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz
00:01:44.627 [Pipeline] sh
00:01:44.914 + git -C spdk log --oneline -n5
00:01:44.914 35cd3e84d bdev/part: Pass through dif_check_flags via dif_check_flags_exclude_mask
00:01:44.914 01a2c4855 bdev/passthru: Pass through dif_check_flags via dif_check_flags_exclude_mask
00:01:44.914 9094b9600 bdev: Assert to check if I/O pass dif_check_flags not enabled by bdev
00:01:44.914 2e10c84c8 nvmf: Expose DIF type of namespace to host again
00:01:44.914 38b931b23 nvmf: Set bdev_ext_io_opts::dif_check_flags_exclude_mask for read/write
00:01:44.932 [Pipeline] withCredentials
00:01:44.943 > git --version # timeout=10
00:01:44.956 > git --version # 'git version 2.39.2'
00:01:44.974 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS
00:01:44.976 [Pipeline] {
00:01:44.985 [Pipeline] retry
00:01:44.987 [Pipeline] {
00:01:45.003 [Pipeline] sh
00:01:45.288 + git ls-remote http://dpdk.org/git/dpdk main
00:01:46.239 [Pipeline] }
00:01:46.259 [Pipeline] // retry
00:01:46.265 [Pipeline] }
00:01:46.283 [Pipeline] // withCredentials
00:01:46.294 [Pipeline] httpRequest
00:01:46.801 [Pipeline] echo
00:01:46.803 Sorcerer 10.211.164.20 is alive
00:01:46.814 [Pipeline] retry
00:01:46.817 [Pipeline] {
00:01:46.831 [Pipeline] httpRequest
00:01:46.835 HttpMethod: GET
00:01:46.836 URL: http://10.211.164.20/packages/dpdk_4843aacb0d1201fef37e8a579fcd8baec4acdf98.tar.gz
00:01:46.836 Sending request to url: http://10.211.164.20/packages/dpdk_4843aacb0d1201fef37e8a579fcd8baec4acdf98.tar.gz
00:01:46.839 Response Code: HTTP/1.1 200 OK
00:01:46.839 Success: Status code 200 is in the accepted range: 200,404
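
The fetch-and-unpack pattern traced above for the jbp and spdk packages (and for the dpdk package just below) can be reproduced outside Jenkins with a few lines of shell. A minimal sketch, assuming the Sorcerer mirror at 10.211.164.20 is reachable; the three-attempt curl loop is an illustrative stand-in for the pipeline's retry/httpRequest steps:

  #!/usr/bin/env bash
  # Hypothetical standalone reproduction of the download + extract steps.
  set -euo pipefail
  pkg=spdk_35cd3e84d4a92eacc8c9de6c2cd81450ef5bcc54.tar.gz  # package name copied from the log
  for attempt in 1 2 3; do                                  # retry count is an assumption
      curl -fsS -o "$pkg" "http://10.211.164.20/packages/$pkg" && break
      sleep 5
  done
  # --no-same-owner leaves extracted files owned by the invoking CI user
  # instead of the UID/GID stored in the archive, matching the tar calls above.
  tar --no-same-owner -xf "$pkg"
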
00:01:46.839 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk_4843aacb0d1201fef37e8a579fcd8baec4acdf98.tar.gz
00:01:50.915 [Pipeline] }
00:01:50.932 [Pipeline] // retry
00:01:50.940 [Pipeline] sh
00:01:51.222 + tar --no-same-owner -xf dpdk_4843aacb0d1201fef37e8a579fcd8baec4acdf98.tar.gz
00:01:52.612 [Pipeline] sh
00:01:52.899 + git -C dpdk log --oneline -n5
00:01:52.899 4843aacb0d doc: describe send scheduling counters in mlx5 guide
00:01:52.899 a4f455560f version: 24.11-rc4
00:01:52.899 0c81db5870 dts: remove leftover node methods
00:01:52.899 71eae7fe3e doc: correct definition of stats per queue feature
00:01:52.900 f2b1510f19 net/octeon_ep: replace use of word segregate
00:01:52.909 [Pipeline] }
00:01:52.924 [Pipeline] // stage
00:01:52.933 [Pipeline] stage
00:01:52.936 [Pipeline] { (Prepare)
00:01:52.957 [Pipeline] writeFile
00:01:52.974 [Pipeline] sh
00:01:53.255 + logger -p user.info -t JENKINS-CI
00:01:53.269 [Pipeline] sh
00:01:53.551 + logger -p user.info -t JENKINS-CI
00:01:53.564 [Pipeline] sh
00:01:53.857 + cat autorun-spdk.conf
00:01:53.857 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:53.857 SPDK_RUN_UBSAN=1
00:01:53.857 SPDK_TEST_FUZZER=1
00:01:53.857 SPDK_TEST_FUZZER_SHORT=1
00:01:53.857 SPDK_TEST_SETUP=1
00:01:53.857 SPDK_TEST_NATIVE_DPDK=main
00:01:53.857 SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:53.864 RUN_NIGHTLY=1
00:01:53.869 [Pipeline] readFile
00:01:53.893 [Pipeline] withEnv
00:01:53.895 [Pipeline] {
00:01:53.908 [Pipeline] sh
00:01:54.195 + set -ex
00:01:54.195 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]]
00:01:54.195 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
00:01:54.195 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:54.195 ++ SPDK_RUN_UBSAN=1
00:01:54.195 ++ SPDK_TEST_FUZZER=1
00:01:54.195 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:54.195 ++ SPDK_TEST_SETUP=1
00:01:54.195 ++ SPDK_TEST_NATIVE_DPDK=main
00:01:54.195 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:54.195 ++ RUN_NIGHTLY=1
00:01:54.195 + case $SPDK_TEST_NVMF_NICS in
00:01:54.195 + DRIVERS=
00:01:54.195 + [[ -n '' ]]
00:01:54.195 + exit 0
00:01:54.205 [Pipeline] }
00:01:54.221 [Pipeline] // withEnv
00:01:54.226 [Pipeline] }
00:01:54.240 [Pipeline] // stage
00:01:54.251 [Pipeline] catchError
00:01:54.254 [Pipeline] {
00:01:54.268 [Pipeline] timeout
00:01:54.268 Timeout set to expire in 30 min
00:01:54.270 [Pipeline] {
00:01:54.283 [Pipeline] stage
00:01:54.285 [Pipeline] { (Tests)
00:01:54.296 [Pipeline] sh
00:01:54.579 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:54.579 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:54.579 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest
00:01:54.579 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]]
00:01:54.579 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:54.579 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:54.579 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]]
00:01:54.579 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:54.579 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output
00:01:54.579 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]]
00:01:54.579 + [[ short-fuzz-phy-autotest == pkgdep-* ]]
00:01:54.579 + cd /var/jenkins/workspace/short-fuzz-phy-autotest
00:01:54.579 + source /etc/os-release
00:01:54.579 ++ NAME='Fedora Linux'
00:01:54.579 ++ VERSION='39 (Cloud Edition)'
00:01:54.579 ++ ID=fedora
00:01:54.579 ++ VERSION_ID=39
00:01:54.579 ++ VERSION_CODENAME=
00:01:54.579 ++ PLATFORM_ID=platform:f39
00:01:54.579 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)'
00:01:54.579 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:54.579 ++ LOGO=fedora-logo-icon
00:01:54.579 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39
00:01:54.579 ++ HOME_URL=https://fedoraproject.org/
00:01:54.579 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/
00:01:54.579 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:54.579 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:54.579 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:54.579 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39
00:01:54.579 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:54.579 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39
00:01:54.579 ++ SUPPORT_END=2024-11-12
00:01:54.579 ++ VARIANT='Cloud Edition'
00:01:54.580 ++ VARIANT_ID=cloud
00:01:54.580 + uname -a
00:01:54.580 Linux spdk-wfp-49 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux
00:01:54.580 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status
00:01:57.884 Hugepages
00:01:57.884 node hugesize free / total
00:01:57.884 node0 1048576kB 0 / 0
00:01:57.884 node0 2048kB 0 / 0
00:01:57.884 node1 1048576kB 0 / 0
00:01:57.884 node1 2048kB 0 / 0
00:01:57.885
00:01:57.885 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:57.885 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:01:57.885 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:01:57.885 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:01:57.885 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:01:57.885 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:01:57.885 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:01:57.885 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:01:57.885 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:01:57.885 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:01:57.885 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:01:57.885 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:01:57.885 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:01:57.885 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:01:57.885 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:01:57.885 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:01:57.885 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:01:57.885 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:01:57.885 + rm -f /tmp/spdk-ld-path
00:01:57.885 + source autorun-spdk.conf
00:01:57.885 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:57.885 ++ SPDK_RUN_UBSAN=1
00:01:57.885 ++ SPDK_TEST_FUZZER=1
00:01:57.885 ++ SPDK_TEST_FUZZER_SHORT=1
00:01:57.885 ++ SPDK_TEST_SETUP=1
00:01:57.885 ++ SPDK_TEST_NATIVE_DPDK=main
00:01:57.885 ++ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build
00:01:57.885 ++ RUN_NIGHTLY=1
00:01:57.885 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:57.885 + [[ -n '' ]]
00:01:57.885 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:01:57.885 + for M in
/var/spdk/build-*-manifest.txt 00:01:57.885 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:01:57.885 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:57.885 + for M in /var/spdk/build-*-manifest.txt 00:01:57.885 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:57.885 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:57.885 + for M in /var/spdk/build-*-manifest.txt 00:01:57.885 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:57.885 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:57.885 ++ uname 00:01:57.885 + [[ Linux == \L\i\n\u\x ]] 00:01:57.885 + sudo dmesg -T 00:01:57.885 + sudo dmesg --clear 00:01:57.885 + dmesg_pid=478604 00:01:57.885 + sudo dmesg -Tw 00:01:57.885 + [[ Fedora Linux == FreeBSD ]] 00:01:57.885 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:57.885 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:57.885 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:57.885 + [[ -x /usr/src/fio-static/fio ]] 00:01:57.885 + export FIO_BIN=/usr/src/fio-static/fio 00:01:57.885 + FIO_BIN=/usr/src/fio-static/fio 00:01:57.885 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:57.885 + [[ ! -v VFIO_QEMU_BIN ]] 00:01:57.885 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:57.885 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:57.885 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:57.885 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:57.885 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:57.885 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:57.885 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:57.885 12:32:27 -- common/autotest_common.sh@1692 -- $ [[ n == y ]] 00:01:57.885 12:32:27 -- spdk/autorun.sh@20 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:57.885 12:32:27 -- short-fuzz-phy-autotest/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:57.885 12:32:27 -- short-fuzz-phy-autotest/autorun-spdk.conf@2 -- $ SPDK_RUN_UBSAN=1 00:01:57.885 12:32:27 -- short-fuzz-phy-autotest/autorun-spdk.conf@3 -- $ SPDK_TEST_FUZZER=1 00:01:57.885 12:32:27 -- short-fuzz-phy-autotest/autorun-spdk.conf@4 -- $ SPDK_TEST_FUZZER_SHORT=1 00:01:57.885 12:32:27 -- short-fuzz-phy-autotest/autorun-spdk.conf@5 -- $ SPDK_TEST_SETUP=1 00:01:57.885 12:32:27 -- short-fuzz-phy-autotest/autorun-spdk.conf@6 -- $ SPDK_TEST_NATIVE_DPDK=main 00:01:57.885 12:32:27 -- short-fuzz-phy-autotest/autorun-spdk.conf@7 -- $ SPDK_RUN_EXTERNAL_DPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:57.885 12:32:27 -- short-fuzz-phy-autotest/autorun-spdk.conf@8 -- $ RUN_NIGHTLY=1 00:01:57.885 12:32:27 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT 00:01:57.885 12:32:27 -- spdk/autorun.sh@25 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autobuild.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:57.885 12:32:27 -- common/autotest_common.sh@1692 -- $ [[ n == y ]] 00:01:57.885 12:32:27 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:57.885 12:32:27 -- scripts/common.sh@15 -- $ shopt -s extglob 00:01:57.885 12:32:27 -- scripts/common.sh@544 -- $ [[ -e 
/bin/wpdk_common.sh ]] 00:01:57.885 12:32:27 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:57.885 12:32:27 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:57.885 12:32:27 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:57.885 12:32:27 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:57.885 12:32:27 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:57.885 12:32:27 -- paths/export.sh@5 -- $ export PATH 00:01:57.885 12:32:27 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:57.885 12:32:27 -- common/autobuild_common.sh@492 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:57.885 12:32:27 -- common/autobuild_common.sh@493 -- $ date +%s 00:01:57.885 12:32:27 -- common/autobuild_common.sh@493 -- $ mktemp -dt spdk_1732793547.XXXXXX 00:01:57.885 12:32:27 -- common/autobuild_common.sh@493 -- $ SPDK_WORKSPACE=/tmp/spdk_1732793547.F1jGAh 00:01:57.885 12:32:27 -- common/autobuild_common.sh@495 -- $ [[ -n '' ]] 00:01:57.885 12:32:27 -- common/autobuild_common.sh@499 -- $ '[' -n main ']' 00:01:57.885 12:32:27 -- common/autobuild_common.sh@500 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:57.885 12:32:27 -- common/autobuild_common.sh@500 -- $ scanbuild_exclude=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk' 00:01:57.885 12:32:27 -- common/autobuild_common.sh@506 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:57.885 12:32:27 -- common/autobuild_common.sh@508 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:57.885 12:32:27 -- 
common/autobuild_common.sh@509 -- $ get_config_params 00:01:57.885 12:32:27 -- common/autotest_common.sh@409 -- $ xtrace_disable 00:01:57.885 12:32:27 -- common/autotest_common.sh@10 -- $ set +x 00:01:57.885 12:32:27 -- common/autobuild_common.sh@509 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user' 00:01:57.885 12:32:27 -- common/autobuild_common.sh@511 -- $ start_monitor_resources 00:01:57.885 12:32:27 -- pm/common@17 -- $ local monitor 00:01:57.885 12:32:27 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:57.885 12:32:27 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:57.885 12:32:27 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:57.885 12:32:27 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:57.885 12:32:27 -- pm/common@25 -- $ sleep 1 00:01:57.885 12:32:27 -- pm/common@21 -- $ date +%s 00:01:57.885 12:32:27 -- pm/common@21 -- $ date +%s 00:01:57.885 12:32:27 -- pm/common@21 -- $ date +%s 00:01:57.885 12:32:27 -- pm/common@21 -- $ date +%s 00:01:57.885 12:32:27 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732793547 00:01:57.885 12:32:27 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732793547 00:01:57.885 12:32:27 -- pm/common@21 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732793547 00:01:57.885 12:32:27 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1732793547 00:01:57.885 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732793547_collect-cpu-load.pm.log 00:01:57.885 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732793547_collect-vmstat.pm.log 00:01:57.885 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732793547_collect-cpu-temp.pm.log 00:01:57.886 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1732793547_collect-bmc-pm.bmc.pm.log 00:01:58.825 12:32:28 -- common/autobuild_common.sh@512 -- $ trap stop_monitor_resources EXIT 00:01:58.825 12:32:28 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:58.825 12:32:28 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:58.825 12:32:28 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:58.825 12:32:28 -- spdk/autobuild.sh@16 -- $ date -u 00:01:58.825 Thu Nov 28 11:32:28 AM UTC 2024 00:01:58.825 12:32:28 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:58.825 v25.01-pre-276-g35cd3e84d 00:01:58.825 12:32:28 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:58.825 12:32:28 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:58.825 12:32:28 -- spdk/autobuild.sh@24 -- 
$ run_test ubsan echo 'using ubsan' 00:01:58.825 12:32:28 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:01:58.825 12:32:28 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:01:58.825 12:32:28 -- common/autotest_common.sh@10 -- $ set +x 00:01:58.825 ************************************ 00:01:58.825 START TEST ubsan 00:01:58.825 ************************************ 00:01:58.825 12:32:28 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan' 00:01:58.825 using ubsan 00:01:58.825 00:01:58.825 real 0m0.000s 00:01:58.825 user 0m0.000s 00:01:58.825 sys 0m0.000s 00:01:58.825 12:32:28 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:01:58.825 12:32:28 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:58.825 ************************************ 00:01:58.825 END TEST ubsan 00:01:58.825 ************************************ 00:01:58.825 12:32:28 -- spdk/autobuild.sh@27 -- $ '[' -n main ']' 00:01:58.825 12:32:28 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:01:58.825 12:32:28 -- common/autobuild_common.sh@449 -- $ run_test build_native_dpdk _build_native_dpdk 00:01:58.825 12:32:28 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']' 00:01:58.825 12:32:28 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:01:58.825 12:32:28 -- common/autotest_common.sh@10 -- $ set +x 00:01:58.825 ************************************ 00:01:58.825 START TEST build_native_dpdk 00:01:58.825 ************************************ 00:01:58.825 12:32:28 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:01:58.825 12:32:28 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk ]] 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk log --oneline -n 5 00:01:58.826 4843aacb0d doc: describe send scheduling counters in mlx5 guide 00:01:58.826 a4f455560f version: 24.11-rc4 00:01:58.826 0c81db5870 dts: remove leftover node methods 00:01:58.826 71eae7fe3e doc: correct definition of stats per queue feature 00:01:58.826 f2b1510f19 net/octeon_ep: replace use of word segregate 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=24.11.0-rc4 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@102 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base" "power/acpi" "power/amd_pstate" "power/cppc" "power/intel_pstate" "power/intel_uncore" "power/kvm_vm") 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@103 -- $ local mlx5_libs_added=n 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]] 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]] 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@146 -- $ [[ 0 -eq 1 ]] 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@174 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@175 -- $ uname -s 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@175 -- $ '[' Linux = Linux ']' 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 24.11.0-rc4 21.11.0 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 24.11.0-rc4 '<' 21.11.0 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:58.826 12:32:28 build_native_dpdk -- 
scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@180 -- $ patch -p1 00:01:58.826 patching file config/rte_config.h 00:01:58.826 Hunk #1 succeeded at 72 (offset 13 lines). 00:01:58.826 12:32:28 build_native_dpdk -- common/autobuild_common.sh@183 -- $ lt 24.11.0-rc4 24.07.0 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 24.11.0-rc4 '<' 24.07.0 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:58.826 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v++ )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 11 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=11 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 11 =~ ^[0-9]+$ ]] 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 11 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=11 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 07 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=07 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 07 =~ ^[0-9]+$ ]] 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 7 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=7 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:01:59.086 12:32:28 build_native_dpdk -- common/autobuild_common.sh@186 -- $ ge 24.11.0-rc4 24.07.0 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 24.11.0-rc4 '>=' 24.07.0 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=4 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:01:59.086 12:32:28 build_native_dpdk -- 
scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v++ )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 11 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=11 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 11 =~ ^[0-9]+$ ]] 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@355 -- $ echo 11 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=11 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 07 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@353 -- $ local d=07 00:01:59.086 12:32:28 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 07 =~ ^[0-9]+$ ]] 00:01:59.086 12:32:29 build_native_dpdk -- scripts/common.sh@355 -- $ echo 7 00:01:59.086 12:32:29 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=7 00:01:59.086 12:32:29 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:01:59.086 12:32:29 build_native_dpdk -- scripts/common.sh@367 -- $ return 0 00:01:59.086 12:32:29 build_native_dpdk -- common/autobuild_common.sh@187 -- $ patch -p1 00:01:59.086 patching file drivers/bus/pci/linux/pci_uio.c 00:01:59.086 12:32:29 build_native_dpdk -- common/autobuild_common.sh@190 -- $ dpdk_kmods=false 00:01:59.086 12:32:29 build_native_dpdk -- common/autobuild_common.sh@191 -- $ uname -s 00:01:59.086 12:32:29 build_native_dpdk -- common/autobuild_common.sh@191 -- $ '[' Linux = FreeBSD ']' 00:01:59.086 12:32:29 build_native_dpdk -- common/autobuild_common.sh@195 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base power/acpi power/amd_pstate power/cppc power/intel_pstate power/intel_uncore power/kvm_vm 00:01:59.086 12:32:29 build_native_dpdk -- common/autobuild_common.sh@195 -- $ meson build-tmp --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm, 00:02:03.279 The Meson build system 00:02:03.279 Version: 1.5.0 00:02:03.279 Source dir: 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk 00:02:03.279 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp 00:02:03.279 Build type: native build 00:02:03.279 Project name: DPDK 00:02:03.279 Project version: 24.11.0-rc4 00:02:03.279 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:03.279 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:03.279 Host machine cpu family: x86_64 00:02:03.279 Host machine cpu: x86_64 00:02:03.279 Message: ## Building in Developer Mode ## 00:02:03.279 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:03.279 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/check-symbols.sh) 00:02:03.279 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/options-ibverbs-static.sh) 00:02:03.279 Program python3 (elftools) found: YES (/usr/bin/python3) modules: elftools 00:02:03.279 Program cat found: YES (/usr/bin/cat) 00:02:03.279 config/meson.build:122: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 00:02:03.279 Compiler for C supports arguments -march=native: YES 00:02:03.279 Checking for size of "void *" : 8 00:02:03.279 Checking for size of "void *" : 8 (cached) 00:02:03.279 Compiler for C supports link arguments -Wl,--undefined-version: YES 00:02:03.279 Library m found: YES 00:02:03.279 Library numa found: YES 00:02:03.279 Has header "numaif.h" : YES 00:02:03.279 Library fdt found: NO 00:02:03.279 Library execinfo found: NO 00:02:03.279 Has header "execinfo.h" : YES 00:02:03.279 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:03.279 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:03.279 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:03.279 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:03.279 Run-time dependency openssl found: YES 3.1.1 00:02:03.279 Run-time dependency libpcap found: YES 1.10.4 00:02:03.279 Has header "pcap.h" with dependency libpcap: YES 00:02:03.279 Compiler for C supports arguments -Wcast-qual: YES 00:02:03.279 Compiler for C supports arguments -Wdeprecated: YES 00:02:03.279 Compiler for C supports arguments -Wformat: YES 00:02:03.279 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:03.279 Compiler for C supports arguments -Wformat-security: NO 00:02:03.279 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:03.279 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:03.279 Compiler for C supports arguments -Wnested-externs: YES 00:02:03.279 Compiler for C supports arguments -Wold-style-definition: YES 00:02:03.279 Compiler for C supports arguments -Wpointer-arith: YES 00:02:03.279 Compiler for C supports arguments -Wsign-compare: YES 00:02:03.279 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:03.279 Compiler for C supports arguments -Wundef: YES 00:02:03.279 Compiler for C supports arguments -Wwrite-strings: YES 00:02:03.279 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:03.279 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:03.279 Program objdump found: YES (/usr/bin/objdump) 00:02:03.279 Compiler for C supports arguments -mavx512f -mavx512vl -mavx512dq -mavx512bw: YES 00:02:03.279 Checking if "AVX512 checking" compiles: YES 00:02:03.279 Fetching value of define "__AVX512F__" : 1 00:02:03.279 Fetching value of define "__AVX512BW__" : 1 00:02:03.279 
Fetching value of define "__AVX512DQ__" : 1 00:02:03.279 Fetching value of define "__AVX512VL__" : 1 00:02:03.279 Fetching value of define "__SSE4_2__" : 1 00:02:03.279 Fetching value of define "__AES__" : 1 00:02:03.279 Fetching value of define "__AVX__" : 1 00:02:03.279 Fetching value of define "__AVX2__" : 1 00:02:03.279 Fetching value of define "__AVX512BW__" : 1 00:02:03.279 Fetching value of define "__AVX512CD__" : 1 00:02:03.279 Fetching value of define "__AVX512DQ__" : 1 00:02:03.279 Fetching value of define "__AVX512F__" : 1 00:02:03.279 Fetching value of define "__AVX512VL__" : 1 00:02:03.279 Fetching value of define "__PCLMUL__" : 1 00:02:03.279 Fetching value of define "__RDRND__" : 1 00:02:03.279 Fetching value of define "__RDSEED__" : 1 00:02:03.279 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:03.279 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:03.279 Message: lib/log: Defining dependency "log" 00:02:03.279 Message: lib/kvargs: Defining dependency "kvargs" 00:02:03.279 Message: lib/argparse: Defining dependency "argparse" 00:02:03.279 Message: lib/telemetry: Defining dependency "telemetry" 00:02:03.279 Checking for function "pthread_attr_setaffinity_np" : YES 00:02:03.279 Checking for function "getentropy" : NO 00:02:03.279 Message: lib/eal: Defining dependency "eal" 00:02:03.279 Message: lib/ptr_compress: Defining dependency "ptr_compress" 00:02:03.279 Message: lib/ring: Defining dependency "ring" 00:02:03.279 Message: lib/rcu: Defining dependency "rcu" 00:02:03.279 Message: lib/mempool: Defining dependency "mempool" 00:02:03.279 Message: lib/mbuf: Defining dependency "mbuf" 00:02:03.279 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:03.279 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:03.279 Compiler for C supports arguments -mpclmul: YES 00:02:03.279 Compiler for C supports arguments -maes: YES 00:02:03.279 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:03.279 Message: lib/net: Defining dependency "net" 00:02:03.279 Message: lib/meter: Defining dependency "meter" 00:02:03.279 Message: lib/ethdev: Defining dependency "ethdev" 00:02:03.279 Message: lib/pci: Defining dependency "pci" 00:02:03.279 Message: lib/cmdline: Defining dependency "cmdline" 00:02:03.279 Message: lib/metrics: Defining dependency "metrics" 00:02:03.279 Message: lib/hash: Defining dependency "hash" 00:02:03.279 Message: lib/timer: Defining dependency "timer" 00:02:03.279 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:03.279 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:03.279 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:03.279 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:03.279 Message: lib/acl: Defining dependency "acl" 00:02:03.279 Message: lib/bbdev: Defining dependency "bbdev" 00:02:03.279 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:03.279 Run-time dependency libelf found: YES 0.191 00:02:03.279 Message: lib/bpf: Defining dependency "bpf" 00:02:03.279 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:03.279 Message: lib/compressdev: Defining dependency "compressdev" 00:02:03.279 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:03.279 Message: lib/distributor: Defining dependency "distributor" 00:02:03.279 Message: lib/dmadev: Defining dependency "dmadev" 00:02:03.279 Message: lib/efd: Defining dependency "efd" 00:02:03.279 Message: lib/eventdev: Defining dependency "eventdev" 00:02:03.279 Message: lib/dispatcher: Defining 
dependency "dispatcher" 00:02:03.279 Message: lib/gpudev: Defining dependency "gpudev" 00:02:03.279 Message: lib/gro: Defining dependency "gro" 00:02:03.279 Message: lib/gso: Defining dependency "gso" 00:02:03.279 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:03.279 Message: lib/jobstats: Defining dependency "jobstats" 00:02:03.279 Message: lib/latencystats: Defining dependency "latencystats" 00:02:03.279 Message: lib/lpm: Defining dependency "lpm" 00:02:03.279 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:03.279 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:03.279 Fetching value of define "__AVX512IFMA__" : (undefined) 00:02:03.279 Compiler for C supports arguments -mavx512f -mavx512dq -mavx512ifma: YES 00:02:03.280 Message: lib/member: Defining dependency "member" 00:02:03.280 Message: lib/pcapng: Defining dependency "pcapng" 00:02:03.280 Message: lib/power: Defining dependency "power" 00:02:03.280 Message: lib/rawdev: Defining dependency "rawdev" 00:02:03.280 Message: lib/regexdev: Defining dependency "regexdev" 00:02:03.280 Message: lib/mldev: Defining dependency "mldev" 00:02:03.280 Message: lib/rib: Defining dependency "rib" 00:02:03.280 Message: lib/reorder: Defining dependency "reorder" 00:02:03.280 Message: lib/sched: Defining dependency "sched" 00:02:03.280 Message: lib/security: Defining dependency "security" 00:02:03.280 Message: lib/stack: Defining dependency "stack" 00:02:03.280 Has header "linux/userfaultfd.h" : YES 00:02:03.280 Has header "linux/vduse.h" : YES 00:02:03.280 Message: lib/vhost: Defining dependency "vhost" 00:02:03.280 Message: lib/ipsec: Defining dependency "ipsec" 00:02:03.280 Message: lib/pdcp: Defining dependency "pdcp" 00:02:03.280 Message: lib/fib: Defining dependency "fib" 00:02:03.280 Message: lib/port: Defining dependency "port" 00:02:03.280 Message: lib/pdump: Defining dependency "pdump" 00:02:03.280 Message: lib/table: Defining dependency "table" 00:02:03.280 Message: lib/pipeline: Defining dependency "pipeline" 00:02:03.280 Message: lib/graph: Defining dependency "graph" 00:02:03.280 Message: lib/node: Defining dependency "node" 00:02:03.280 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:03.280 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:03.280 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:03.280 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:03.280 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:03.280 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:03.280 Compiler for C supports arguments -Wno-unused-value: YES 00:02:03.280 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:03.280 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:03.280 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:03.280 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:03.851 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:03.851 Message: drivers/power/acpi: Defining dependency "power_acpi" 00:02:03.851 Message: drivers/power/amd_pstate: Defining dependency "power_amd_pstate" 00:02:03.851 Message: drivers/power/cppc: Defining dependency "power_cppc" 00:02:03.851 Message: drivers/power/intel_pstate: Defining dependency "power_intel_pstate" 00:02:03.851 Message: drivers/power/intel_uncore: Defining dependency "power_intel_uncore" 00:02:03.851 Message: drivers/power/kvm_vm: Defining dependency "power_kvm_vm" 00:02:03.851 
Has header "sys/epoll.h" : YES 00:02:03.851 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:03.851 Configuring doxy-api-html.conf using configuration 00:02:03.851 Configuring doxy-api-man.conf using configuration 00:02:03.851 Program mandb found: YES (/usr/bin/mandb) 00:02:03.851 Program sphinx-build found: NO 00:02:03.851 Program sphinx-build found: NO 00:02:03.851 Configuring rte_build_config.h using configuration 00:02:03.851 Message: 00:02:03.851 ================= 00:02:03.851 Applications Enabled 00:02:03.851 ================= 00:02:03.851 00:02:03.851 apps: 00:02:03.851 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, test-cmdline, test-compress-perf, 00:02:03.851 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline, 00:02:03.851 test-pmd, test-regex, test-sad, test-security-perf, 00:02:03.851 00:02:03.851 Message: 00:02:03.851 ================= 00:02:03.851 Libraries Enabled 00:02:03.851 ================= 00:02:03.851 00:02:03.851 libs: 00:02:03.851 log, kvargs, argparse, telemetry, eal, ptr_compress, ring, rcu, 00:02:03.851 mempool, mbuf, net, meter, ethdev, pci, cmdline, metrics, 00:02:03.851 hash, timer, acl, bbdev, bitratestats, bpf, cfgfile, compressdev, 00:02:03.851 cryptodev, distributor, dmadev, efd, eventdev, dispatcher, gpudev, gro, 00:02:03.851 gso, ip_frag, jobstats, latencystats, lpm, member, pcapng, power, 00:02:03.851 rawdev, regexdev, mldev, rib, reorder, sched, security, stack, 00:02:03.851 vhost, ipsec, pdcp, fib, port, pdump, table, pipeline, 00:02:03.851 graph, node, 00:02:03.851 00:02:03.851 Message: 00:02:03.851 =============== 00:02:03.851 Drivers Enabled 00:02:03.851 =============== 00:02:03.851 00:02:03.851 common: 00:02:03.851 00:02:03.851 bus: 00:02:03.851 pci, vdev, 00:02:03.851 mempool: 00:02:03.851 ring, 00:02:03.851 dma: 00:02:03.851 00:02:03.851 net: 00:02:03.851 i40e, 00:02:03.851 raw: 00:02:03.851 00:02:03.851 crypto: 00:02:03.851 00:02:03.851 compress: 00:02:03.851 00:02:03.851 regex: 00:02:03.851 00:02:03.851 ml: 00:02:03.851 00:02:03.851 vdpa: 00:02:03.851 00:02:03.851 event: 00:02:03.851 00:02:03.851 baseband: 00:02:03.851 00:02:03.851 gpu: 00:02:03.851 00:02:03.851 power: 00:02:03.851 acpi, amd_pstate, cppc, intel_pstate, intel_uncore, kvm_vm, 00:02:03.851 00:02:03.851 Message: 00:02:03.851 ================= 00:02:03.851 Content Skipped 00:02:03.851 ================= 00:02:03.851 00:02:03.851 apps: 00:02:03.851 00:02:03.851 libs: 00:02:03.851 00:02:03.851 drivers: 00:02:03.851 common/cpt: not in enabled drivers build config 00:02:03.851 common/dpaax: not in enabled drivers build config 00:02:03.851 common/iavf: not in enabled drivers build config 00:02:03.851 common/idpf: not in enabled drivers build config 00:02:03.851 common/ionic: not in enabled drivers build config 00:02:03.851 common/mvep: not in enabled drivers build config 00:02:03.851 common/octeontx: not in enabled drivers build config 00:02:03.851 bus/auxiliary: not in enabled drivers build config 00:02:03.851 bus/cdx: not in enabled drivers build config 00:02:03.851 bus/dpaa: not in enabled drivers build config 00:02:03.851 bus/fslmc: not in enabled drivers build config 00:02:03.851 bus/ifpga: not in enabled drivers build config 00:02:03.851 bus/platform: not in enabled drivers build config 00:02:03.851 bus/uacce: not in enabled drivers build config 00:02:03.851 bus/vmbus: not in enabled drivers build config 00:02:03.851 common/cnxk: not in enabled drivers build config 00:02:03.851 common/mlx5: not in 
enabled drivers build config 00:02:03.851 common/nfp: not in enabled drivers build config 00:02:03.851 common/nitrox: not in enabled drivers build config 00:02:03.851 common/qat: not in enabled drivers build config 00:02:03.851 common/sfc_efx: not in enabled drivers build config 00:02:03.851 mempool/bucket: not in enabled drivers build config 00:02:03.851 mempool/cnxk: not in enabled drivers build config 00:02:03.851 mempool/dpaa: not in enabled drivers build config 00:02:03.851 mempool/dpaa2: not in enabled drivers build config 00:02:03.851 mempool/octeontx: not in enabled drivers build config 00:02:03.851 mempool/stack: not in enabled drivers build config 00:02:03.851 dma/cnxk: not in enabled drivers build config 00:02:03.851 dma/dpaa: not in enabled drivers build config 00:02:03.851 dma/dpaa2: not in enabled drivers build config 00:02:03.851 dma/hisilicon: not in enabled drivers build config 00:02:03.851 dma/idxd: not in enabled drivers build config 00:02:03.851 dma/ioat: not in enabled drivers build config 00:02:03.851 dma/odm: not in enabled drivers build config 00:02:03.851 dma/skeleton: not in enabled drivers build config 00:02:03.851 net/af_packet: not in enabled drivers build config 00:02:03.851 net/af_xdp: not in enabled drivers build config 00:02:03.851 net/ark: not in enabled drivers build config 00:02:03.851 net/atlantic: not in enabled drivers build config 00:02:03.851 net/avp: not in enabled drivers build config 00:02:03.851 net/axgbe: not in enabled drivers build config 00:02:03.851 net/bnx2x: not in enabled drivers build config 00:02:03.851 net/bnxt: not in enabled drivers build config 00:02:03.851 net/bonding: not in enabled drivers build config 00:02:03.851 net/cnxk: not in enabled drivers build config 00:02:03.851 net/cpfl: not in enabled drivers build config 00:02:03.851 net/cxgbe: not in enabled drivers build config 00:02:03.851 net/dpaa: not in enabled drivers build config 00:02:03.851 net/dpaa2: not in enabled drivers build config 00:02:03.851 net/e1000: not in enabled drivers build config 00:02:03.851 net/ena: not in enabled drivers build config 00:02:03.851 net/enetc: not in enabled drivers build config 00:02:03.851 net/enetfec: not in enabled drivers build config 00:02:03.851 net/enic: not in enabled drivers build config 00:02:03.851 net/failsafe: not in enabled drivers build config 00:02:03.851 net/fm10k: not in enabled drivers build config 00:02:03.851 net/gve: not in enabled drivers build config 00:02:03.851 net/hinic: not in enabled drivers build config 00:02:03.851 net/hns3: not in enabled drivers build config 00:02:03.851 net/iavf: not in enabled drivers build config 00:02:03.851 net/ice: not in enabled drivers build config 00:02:03.851 net/idpf: not in enabled drivers build config 00:02:03.851 net/igc: not in enabled drivers build config 00:02:03.851 net/ionic: not in enabled drivers build config 00:02:03.851 net/ipn3ke: not in enabled drivers build config 00:02:03.851 net/ixgbe: not in enabled drivers build config 00:02:03.851 net/mana: not in enabled drivers build config 00:02:03.851 net/memif: not in enabled drivers build config 00:02:03.851 net/mlx4: not in enabled drivers build config 00:02:03.851 net/mlx5: not in enabled drivers build config 00:02:03.851 net/mvneta: not in enabled drivers build config 00:02:03.851 net/mvpp2: not in enabled drivers build config 00:02:03.851 net/netvsc: not in enabled drivers build config 00:02:03.851 net/nfb: not in enabled drivers build config 00:02:03.851 net/nfp: not in enabled drivers build config 00:02:03.851 
net/ngbe: not in enabled drivers build config 00:02:03.851 net/ntnic: not in enabled drivers build config 00:02:03.851 net/null: not in enabled drivers build config 00:02:03.851 net/octeontx: not in enabled drivers build config 00:02:03.851 net/octeon_ep: not in enabled drivers build config 00:02:03.851 net/pcap: not in enabled drivers build config 00:02:03.851 net/pfe: not in enabled drivers build config 00:02:03.851 net/qede: not in enabled drivers build config 00:02:03.851 net/r8169: not in enabled drivers build config 00:02:03.851 net/ring: not in enabled drivers build config 00:02:03.851 net/sfc: not in enabled drivers build config 00:02:03.852 net/softnic: not in enabled drivers build config 00:02:03.852 net/tap: not in enabled drivers build config 00:02:03.852 net/thunderx: not in enabled drivers build config 00:02:03.852 net/txgbe: not in enabled drivers build config 00:02:03.852 net/vdev_netvsc: not in enabled drivers build config 00:02:03.852 net/vhost: not in enabled drivers build config 00:02:03.852 net/virtio: not in enabled drivers build config 00:02:03.852 net/vmxnet3: not in enabled drivers build config 00:02:03.852 net/zxdh: not in enabled drivers build config 00:02:03.852 raw/cnxk_bphy: not in enabled drivers build config 00:02:03.852 raw/cnxk_gpio: not in enabled drivers build config 00:02:03.852 raw/cnxk_rvu_lf: not in enabled drivers build config 00:02:03.852 raw/dpaa2_cmdif: not in enabled drivers build config 00:02:03.852 raw/gdtc: not in enabled drivers build config 00:02:03.852 raw/ifpga: not in enabled drivers build config 00:02:03.852 raw/ntb: not in enabled drivers build config 00:02:03.852 raw/skeleton: not in enabled drivers build config 00:02:03.852 crypto/armv8: not in enabled drivers build config 00:02:03.852 crypto/bcmfs: not in enabled drivers build config 00:02:03.852 crypto/caam_jr: not in enabled drivers build config 00:02:03.852 crypto/ccp: not in enabled drivers build config 00:02:03.852 crypto/cnxk: not in enabled drivers build config 00:02:03.852 crypto/dpaa_sec: not in enabled drivers build config 00:02:03.852 crypto/dpaa2_sec: not in enabled drivers build config 00:02:03.852 crypto/ionic: not in enabled drivers build config 00:02:03.852 crypto/ipsec_mb: not in enabled drivers build config 00:02:03.852 crypto/mlx5: not in enabled drivers build config 00:02:03.852 crypto/mvsam: not in enabled drivers build config 00:02:03.852 crypto/nitrox: not in enabled drivers build config 00:02:03.852 crypto/null: not in enabled drivers build config 00:02:03.852 crypto/octeontx: not in enabled drivers build config 00:02:03.852 crypto/openssl: not in enabled drivers build config 00:02:03.852 crypto/scheduler: not in enabled drivers build config 00:02:03.852 crypto/uadk: not in enabled drivers build config 00:02:03.852 crypto/virtio: not in enabled drivers build config 00:02:03.852 compress/isal: not in enabled drivers build config 00:02:03.852 compress/mlx5: not in enabled drivers build config 00:02:03.852 compress/nitrox: not in enabled drivers build config 00:02:03.852 compress/octeontx: not in enabled drivers build config 00:02:03.852 compress/uadk: not in enabled drivers build config 00:02:03.852 compress/zlib: not in enabled drivers build config 00:02:03.852 regex/mlx5: not in enabled drivers build config 00:02:03.852 regex/cn9k: not in enabled drivers build config 00:02:03.852 ml/cnxk: not in enabled drivers build config 00:02:03.852 vdpa/ifc: not in enabled drivers build config 00:02:03.852 vdpa/mlx5: not in enabled drivers build config 00:02:03.852 
vdpa/nfp: not in enabled drivers build config 00:02:03.852 vdpa/sfc: not in enabled drivers build config 00:02:03.852 event/cnxk: not in enabled drivers build config 00:02:03.852 event/dlb2: not in enabled drivers build config 00:02:03.852 event/dpaa: not in enabled drivers build config 00:02:03.852 event/dpaa2: not in enabled drivers build config 00:02:03.852 event/dsw: not in enabled drivers build config 00:02:03.852 event/opdl: not in enabled drivers build config 00:02:03.852 event/skeleton: not in enabled drivers build config 00:02:03.852 event/sw: not in enabled drivers build config 00:02:03.852 event/octeontx: not in enabled drivers build config 00:02:03.852 baseband/acc: not in enabled drivers build config 00:02:03.852 baseband/fpga_5gnr_fec: not in enabled drivers build config 00:02:03.852 baseband/fpga_lte_fec: not in enabled drivers build config 00:02:03.852 baseband/la12xx: not in enabled drivers build config 00:02:03.852 baseband/null: not in enabled drivers build config 00:02:03.852 baseband/turbo_sw: not in enabled drivers build config 00:02:03.852 gpu/cuda: not in enabled drivers build config 00:02:03.852 power/amd_uncore: not in enabled drivers build config 00:02:03.852 00:02:03.852 00:02:03.852 Message: DPDK build config complete: 00:02:03.852 source path = "/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk" 00:02:03.852 build path = "/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp" 00:02:03.852 Build targets in project: 246 00:02:03.852 00:02:03.852 DPDK 24.11.0-rc4 00:02:03.852 00:02:03.852 User defined options 00:02:03.852 libdir : lib 00:02:03.852 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:02:03.852 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow 00:02:03.852 c_link_args : 00:02:03.852 enable_docs : false 00:02:04.838 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm, 00:02:04.838 enable_kmods : false 00:02:04.838 machine : native 00:02:04.838 tests : false 00:02:04.838 00:02:04.838 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:04.838 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated. 
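Meson warns above because the configure step was launched as `meson [options]` without the explicit `setup` subcommand. As a minimal sketch only, reconstructed from the "User defined options" block echoed in the log rather than from autobuild_common.sh itself, the equivalent non-deprecated invocation would look roughly like this (build directory first, then source directory, as `meson setup` expects):

    # Hypothetical re-run of the logged configure step; option values are
    # copied verbatim from the "User defined options" block above.
    meson setup \
      --prefix=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build \
      --libdir=lib \
      -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
      -Denable_docs=false \
      -Denable_kmods=false \
      -Dmachine=native \
      -Dtests=false \
      -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm \
      /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp \
      /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk

Spelling out the subcommand silences the deprecation WARNING without changing the resulting build-tmp configuration; the ninja build that follows proceeds the same either way.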
00:02:04.838 12:32:34 build_native_dpdk -- common/autobuild_common.sh@199 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j72 00:02:04.838 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:04.838 [1/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:04.838 [2/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:04.838 [3/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:04.838 [4/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:05.122 [5/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:05.122 [6/766] Compiling C object lib/librte_log.a.p/log_log_syslog.c.o 00:02:05.122 [7/766] Compiling C object lib/librte_log.a.p/log_log_color.c.o 00:02:05.122 [8/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:05.122 [9/766] Compiling C object lib/librte_log.a.p/log_log_timestamp.c.o 00:02:05.122 [10/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:05.122 [11/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:05.122 [12/766] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:05.122 [13/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:05.122 [14/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:05.122 [15/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:05.122 [16/766] Linking static target lib/librte_kvargs.a 00:02:05.122 [17/766] Compiling C object lib/librte_log.a.p/log_log_journal.c.o 00:02:05.122 [18/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:05.122 [19/766] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:05.122 [20/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:05.122 [21/766] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:05.122 [22/766] Linking static target lib/librte_log.a 00:02:05.421 [23/766] Compiling C object lib/librte_argparse.a.p/argparse_rte_argparse.c.o 00:02:05.421 [24/766] Linking static target lib/librte_argparse.a 00:02:05.421 [25/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore_var.c.o 00:02:05.421 [26/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:05.421 [27/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:05.421 [28/766] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.421 [29/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:05.421 [30/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:05.421 [31/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:05.421 [32/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:05.421 [33/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:05.421 [34/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_bitset.c.o 00:02:05.421 [35/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:05.421 [36/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:05.421 [37/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:05.421 [38/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 
00:02:05.421 [39/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:05.421 [40/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:05.421 [41/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:05.421 [42/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:05.421 [43/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:05.683 [44/766] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:05.683 [45/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:05.683 [46/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:05.683 [47/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_mmu.c.o 00:02:05.683 [48/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:05.683 [49/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:05.683 [50/766] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:05.683 [51/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:05.683 [52/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:05.683 [53/766] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:05.683 [54/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:05.683 [55/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:05.683 [56/766] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:05.683 [57/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:05.683 [58/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:05.683 [59/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:05.683 [60/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:05.683 [61/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:05.683 [62/766] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:05.683 [63/766] Generating lib/argparse.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.683 [64/766] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:05.683 [65/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:05.683 [66/766] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:05.683 [67/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:05.683 [68/766] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:05.683 [69/766] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:05.683 [70/766] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:05.683 [71/766] Linking static target lib/librte_pci.a 00:02:05.683 [72/766] Linking static target lib/librte_ring.a 00:02:05.683 [73/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:05.683 [74/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:05.683 [75/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:05.683 [76/766] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:05.683 [77/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:05.683 [78/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:05.683 
[79/766] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:05.683 [80/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:05.683 [81/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:05.683 [82/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:05.683 [83/766] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:05.683 [84/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:05.683 [85/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:05.683 [86/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:05.683 [87/766] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:05.942 [88/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:05.942 [89/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:05.942 [90/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:05.942 [91/766] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:05.942 [92/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:05.942 [93/766] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:05.942 [94/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:05.942 [95/766] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:05.942 [96/766] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:05.942 [97/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:05.942 [98/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:05.942 [99/766] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.208 [100/766] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.208 [101/766] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:06.208 [102/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:06.208 [103/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:06.208 [104/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:06.208 [105/766] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:06.208 [106/766] Linking target lib/librte_log.so.25.0 00:02:06.208 [107/766] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:06.208 [108/766] Linking static target lib/librte_meter.a 00:02:06.208 [109/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:06.208 [110/766] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.208 [111/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:06.208 [112/766] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:06.208 [113/766] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:06.208 [114/766] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:06.208 [115/766] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:06.208 [116/766] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:06.208 [117/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:06.208 [118/766] Compiling C object 
lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:06.208 [119/766] Linking static target lib/librte_cmdline.a 00:02:06.208 [120/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:06.208 [121/766] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:06.208 [122/766] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:06.208 [123/766] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:06.208 [124/766] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:06.208 [125/766] Linking static target lib/librte_net.a 00:02:06.208 [126/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:06.208 [127/766] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:06.208 [128/766] Linking static target lib/librte_metrics.a 00:02:06.470 [129/766] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:06.470 [130/766] Generating symbol file lib/librte_log.so.25.0.p/librte_log.so.25.0.symbols 00:02:06.470 [131/766] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:06.470 [132/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:06.470 [133/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:06.470 [134/766] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:06.470 [135/766] Linking target lib/librte_kvargs.so.25.0 00:02:06.470 [136/766] Linking static target lib/librte_mempool.a 00:02:06.470 [137/766] Linking static target lib/librte_cfgfile.a 00:02:06.470 [138/766] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:06.470 [139/766] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:06.470 [140/766] Linking target lib/librte_argparse.so.25.0 00:02:06.470 [141/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:06.470 [142/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:06.470 [143/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:06.470 [144/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:06.470 [145/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:06.470 [146/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:06.470 [147/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:06.470 [148/766] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:06.470 [149/766] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.470 [150/766] Linking static target lib/librte_bitratestats.a 00:02:06.470 [151/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:06.470 [152/766] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gf2_poly_math.c.o 00:02:06.738 [153/766] Generating symbol file lib/librte_kvargs.so.25.0.p/librte_kvargs.so.25.0.symbols 00:02:06.738 [154/766] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:06.738 [155/766] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:06.738 [156/766] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:06.738 [157/766] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:06.738 [158/766] Linking static target lib/librte_rcu.a 00:02:06.738 [159/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:06.738 [160/766] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:06.738 
[161/766] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:06.738 [162/766] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:06.738 [163/766] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:06.738 [164/766] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:06.738 [165/766] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:06.738 [166/766] Linking static target lib/librte_telemetry.a 00:02:06.738 [167/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:06.738 [168/766] Linking static target lib/librte_eal.a 00:02:06.738 [169/766] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:06.738 [170/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:06.738 [171/766] Linking static target lib/librte_timer.a 00:02:07.001 [172/766] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.001 [173/766] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:07.001 [174/766] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:07.001 [175/766] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:07.001 [176/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:07.001 [177/766] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.001 [178/766] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:07.001 [179/766] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.001 [180/766] Linking static target lib/librte_compressdev.a 00:02:07.001 [181/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:07.001 [182/766] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:07.001 [183/766] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:07.001 [184/766] Linking static target lib/librte_mbuf.a 00:02:07.001 [185/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:07.001 [186/766] Compiling C object lib/librte_power.a.p/power_rte_power_qos.c.o 00:02:07.001 [187/766] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:07.001 [188/766] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:07.001 [189/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:07.268 [190/766] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:07.268 [191/766] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:07.268 [192/766] Linking static target lib/librte_bbdev.a 00:02:07.268 [193/766] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:07.268 [194/766] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:07.268 [195/766] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.268 [196/766] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:02:07.268 [197/766] Linking static target lib/librte_dispatcher.a 00:02:07.268 [198/766] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:07.268 [199/766] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:07.268 [200/766] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:07.268 [201/766] Compiling C object 
lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:07.268 [202/766] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:07.268 [203/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:07.268 [204/766] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:02:07.533 [205/766] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:07.533 [206/766] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:07.533 [207/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:07.533 [208/766] Linking static target lib/librte_jobstats.a 00:02:07.533 [209/766] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:07.533 [210/766] Linking static target lib/librte_gpudev.a 00:02:07.533 [211/766] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:07.533 [212/766] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:07.533 [213/766] Linking static target lib/librte_dmadev.a 00:02:07.533 [214/766] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.533 [215/766] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:07.533 [216/766] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:07.533 [217/766] Linking static target lib/librte_distributor.a 00:02:07.533 [218/766] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:07.533 [219/766] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.533 [220/766] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.533 [221/766] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:07.533 [222/766] Linking target lib/librte_telemetry.so.25.0 00:02:07.533 [223/766] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:07.533 [224/766] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:07.534 [225/766] Linking static target lib/librte_gro.a 00:02:07.534 [226/766] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:07.534 [227/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:07.534 [228/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:07.534 [229/766] Compiling C object lib/librte_power.a.p/power_rte_power_cpufreq.c.o 00:02:07.534 [230/766] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.534 [231/766] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:07.797 [232/766] Linking static target lib/librte_latencystats.a 00:02:07.797 [233/766] Compiling C object lib/member/libsketch_avx512_tmp.a.p/rte_member_sketch_avx512.c.o 00:02:07.797 [234/766] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:02:07.798 [235/766] Linking static target lib/member/libsketch_avx512_tmp.a 00:02:07.798 [236/766] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:07.798 [237/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:07.798 [238/766] Linking static target lib/librte_gso.a 00:02:07.798 [239/766] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:02:07.798 [240/766] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:07.798 [241/766] Linking static target lib/librte_bpf.a 00:02:07.798 [242/766] Compiling C object 
lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:07.798 [243/766] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:07.798 [244/766] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:07.798 [245/766] Generating symbol file lib/librte_telemetry.so.25.0.p/librte_telemetry.so.25.0.symbols 00:02:07.798 [246/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:02:07.798 [247/766] Linking static target lib/librte_ip_frag.a 00:02:07.798 [248/766] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.798 [249/766] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:02:07.798 [250/766] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:07.798 [251/766] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:07.798 [252/766] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:07.798 [253/766] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.798 [254/766] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:07.798 [255/766] Linking static target lib/librte_power.a 00:02:08.062 [256/766] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:08.062 [257/766] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.062 [258/766] Linking static target lib/librte_regexdev.a 00:02:08.062 [259/766] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.062 [260/766] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:08.062 [261/766] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.062 [262/766] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:08.062 [263/766] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:08.062 [264/766] Compiling C object lib/librte_port.a.p/port_port_log.c.o 00:02:08.062 [265/766] Linking static target lib/librte_stack.a 00:02:08.062 [266/766] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:08.062 [267/766] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.062 [268/766] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.062 [269/766] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:08.062 [270/766] Linking static target lib/librte_rawdev.a 00:02:08.062 [271/766] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.062 [272/766] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:02:08.327 [273/766] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:08.328 [274/766] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:08.328 [275/766] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.328 [276/766] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:08.328 [277/766] Linking static target lib/librte_pcapng.a 00:02:08.328 [278/766] Linking static target lib/librte_reorder.a 00:02:08.328 [279/766] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.328 [280/766] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 
00:02:08.328 [281/766] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:08.328 [282/766] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:02:08.328 [283/766] Linking static target lib/librte_mldev.a 00:02:08.328 [284/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:08.328 [285/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:08.328 [286/766] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:08.328 [287/766] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:08.328 [288/766] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:08.328 [289/766] Linking static target lib/librte_efd.a 00:02:08.328 [290/766] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:08.328 [291/766] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.328 [292/766] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:08.328 [293/766] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:08.593 [294/766] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:08.593 [295/766] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:08.593 [296/766] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:08.593 [297/766] Linking static target lib/librte_rib.a 00:02:08.593 [298/766] Linking static target lib/librte_security.a 00:02:08.593 [299/766] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 00:02:08.593 [300/766] Linking static target lib/librte_lpm.a 00:02:08.593 [301/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:02:08.593 [302/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:02:08.593 [303/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:02:08.593 [304/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:08.593 [305/766] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:08.593 [306/766] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:08.593 [307/766] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:08.593 [308/766] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:08.593 [309/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:02:08.593 [310/766] Compiling C object lib/librte_table.a.p/table_table_log.c.o 00:02:08.593 [311/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:08.593 [312/766] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.593 [313/766] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:08.593 [314/766] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:08.857 [315/766] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:08.857 [316/766] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:08.857 [317/766] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.857 [318/766] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.857 [319/766] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:08.857 [320/766] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:02:08.857 [321/766] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.857 [322/766] Compiling C object 
lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:08.857 [323/766] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.857 [324/766] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:08.857 [325/766] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:09.123 [326/766] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.123 [327/766] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o 00:02:09.123 [328/766] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:09.123 [329/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o 00:02:09.123 [330/766] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.123 [331/766] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:09.123 [332/766] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.123 [333/766] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.123 [334/766] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:09.123 [335/766] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:09.123 [336/766] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o 00:02:09.123 [337/766] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:09.123 [338/766] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:09.123 [339/766] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:09.382 [340/766] Compiling C object lib/librte_node.a.p/node_null.c.o 00:02:09.382 [341/766] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:09.382 [342/766] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:09.382 [343/766] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o 00:02:09.382 [344/766] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:09.382 [345/766] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:09.382 [346/766] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:09.382 [347/766] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o 00:02:09.382 [348/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o 00:02:09.382 [349/766] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:09.382 [350/766] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:09.383 [351/766] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o 00:02:09.383 [352/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:09.383 [353/766] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o 00:02:09.383 [354/766] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:09.383 [355/766] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:09.644 [356/766] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o 00:02:09.644 [357/766] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o 00:02:09.644 [358/766] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:09.644 [359/766] Linking static target lib/librte_cryptodev.a 00:02:09.644 [360/766] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o 00:02:09.644 [361/766] 
Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:09.644 [362/766] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:09.644 [363/766] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:09.644 [364/766] Compiling C object lib/librte_graph.a.p/graph_node.c.o 00:02:09.644 [365/766] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:09.644 [366/766] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o 00:02:09.910 [367/766] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o 00:02:09.910 [368/766] Compiling C object lib/librte_node.a.p/node_log.c.o 00:02:09.910 [369/766] Compiling C object lib/librte_graph.a.p/graph_graph.c.o 00:02:09.910 [370/766] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o 00:02:09.910 [371/766] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:09.910 [372/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o 00:02:09.910 [373/766] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:09.910 [374/766] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o 00:02:09.910 [375/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o 00:02:09.910 [376/766] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o 00:02:09.910 [377/766] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o 00:02:09.910 [378/766] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o 00:02:10.175 [379/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:10.175 [380/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o 00:02:10.175 [381/766] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o 00:02:10.175 [382/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o 00:02:10.175 [383/766] Linking static target lib/librte_pdump.a 00:02:10.175 [384/766] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:10.175 [385/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o 00:02:10.175 [386/766] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o 00:02:10.175 [387/766] Linking static target lib/librte_sched.a 00:02:10.175 [388/766] Compiling C object drivers/libtmp_rte_power_kvm_vm.a.p/power_kvm_vm_guest_channel.c.o 00:02:10.176 [389/766] Compiling C object drivers/libtmp_rte_power_kvm_vm.a.p/power_kvm_vm_kvm_vm.c.o 00:02:10.176 [390/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:10.176 [391/766] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o 00:02:10.176 [392/766] Linking static target drivers/libtmp_rte_power_kvm_vm.a 00:02:10.176 [393/766] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o 00:02:10.176 [394/766] Linking static target lib/librte_graph.a 00:02:10.176 [395/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o 00:02:10.176 [396/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:10.176 [397/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o 00:02:10.176 [398/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:10.435 [399/766] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:10.435 [400/766] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:10.435 [401/766] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o 00:02:10.435 [402/766] 
Generating app/graph/commands_hdr with a custom command (wrapped by meson to capture output) 00:02:10.435 [403/766] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:10.435 [404/766] Linking static target lib/librte_member.a 00:02:10.435 [405/766] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o 00:02:10.435 [406/766] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:10.435 [407/766] Linking static target lib/librte_hash.a 00:02:10.435 [408/766] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o 00:02:10.435 [409/766] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:10.435 [410/766] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o 00:02:10.435 [411/766] Linking static target lib/librte_fib.a 00:02:10.435 [412/766] Linking static target lib/librte_table.a 00:02:10.435 [413/766] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o 00:02:10.435 [414/766] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:02:10.705 [415/766] Linking static target lib/acl/libavx2_tmp.a 00:02:10.705 [416/766] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.705 [417/766] Generating drivers/rte_power_kvm_vm.pmd.c with a custom command 00:02:10.705 [418/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o 00:02:10.705 [419/766] Compiling C object drivers/librte_power_kvm_vm.a.p/meson-generated_.._rte_power_kvm_vm.pmd.c.o 00:02:10.705 [420/766] Linking static target drivers/librte_power_kvm_vm.a 00:02:10.705 [421/766] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o 00:02:10.705 [422/766] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o 00:02:10.705 [423/766] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:10.705 [424/766] Compiling C object drivers/librte_power_kvm_vm.so.25.0.p/meson-generated_.._rte_power_kvm_vm.pmd.c.o 00:02:10.705 [425/766] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:10.705 [426/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o 00:02:10.705 [427/766] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:10.705 [428/766] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:10.705 [429/766] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:10.705 [430/766] Linking static target drivers/librte_bus_vdev.a 00:02:10.705 [431/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o 00:02:10.705 [432/766] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:10.705 [433/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o 00:02:10.705 [434/766] Linking static target lib/librte_eventdev.a 00:02:10.705 [435/766] Compiling C object drivers/librte_bus_vdev.so.25.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:10.705 [436/766] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.705 [437/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o 00:02:10.705 [438/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o 00:02:10.970 [439/766] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.970 [440/766] Compiling C object drivers/libtmp_rte_power_acpi.a.p/power_acpi_acpi_cpufreq.c.o 00:02:10.970 [441/766] Compiling C object 
app/dpdk-test-bbdev.p/test-bbdev_main.c.o 00:02:10.970 [442/766] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o 00:02:10.970 [443/766] Linking static target drivers/libtmp_rte_power_acpi.a 00:02:10.970 [444/766] Compiling C object drivers/libtmp_rte_power_cppc.a.p/power_cppc_cppc_cpufreq.c.o 00:02:10.970 [445/766] Linking static target drivers/libtmp_rte_power_cppc.a 00:02:10.970 [446/766] Compiling C object drivers/libtmp_rte_power_amd_pstate.a.p/power_amd_pstate_amd_pstate_cpufreq.c.o 00:02:10.970 [447/766] Linking static target drivers/libtmp_rte_power_amd_pstate.a 00:02:10.970 [448/766] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.970 [449/766] Generating drivers/rte_power_kvm_vm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:10.970 [450/766] Compiling C object drivers/libtmp_rte_power_intel_uncore.a.p/power_intel_uncore_intel_uncore.c.o 00:02:10.970 [451/766] Linking static target drivers/libtmp_rte_power_intel_uncore.a 00:02:10.970 [452/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o 00:02:10.970 [453/766] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:02:10.970 [454/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o 00:02:10.970 [455/766] Linking static target lib/librte_pdcp.a 00:02:10.970 [456/766] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:10.970 [457/766] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:10.970 [458/766] Linking static target lib/librte_ipsec.a 00:02:10.970 [459/766] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:11.233 [460/766] Compiling C object app/dpdk-graph.p/graph_cli.c.o 00:02:11.233 [461/766] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:11.233 [462/766] Linking static target lib/librte_acl.a 00:02:11.233 [463/766] Compiling C object drivers/libtmp_rte_power_intel_pstate.a.p/power_intel_pstate_intel_pstate_cpufreq.c.o 00:02:11.233 [464/766] Linking static target drivers/librte_bus_pci.a 00:02:11.233 [465/766] Linking static target drivers/libtmp_rte_power_intel_pstate.a 00:02:11.233 [466/766] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.233 [467/766] Compiling C object drivers/librte_bus_pci.so.25.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:11.233 [468/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o 00:02:11.233 [469/766] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.233 [470/766] Compiling C object app/dpdk-graph.p/graph_conn.c.o 00:02:11.233 [471/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o 00:02:11.233 [472/766] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:11.233 [473/766] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:11.233 [474/766] Generating drivers/rte_power_cppc.pmd.c with a custom command 00:02:11.233 [475/766] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o 00:02:11.233 [476/766] Generating drivers/rte_power_acpi.pmd.c with a custom command 00:02:11.233 [477/766] Compiling C object drivers/librte_power_cppc.a.p/meson-generated_.._rte_power_cppc.pmd.c.o 00:02:11.233 [478/766] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.233 [479/766] 
Linking static target drivers/librte_power_cppc.a 00:02:11.233 [480/766] Compiling C object drivers/librte_power_cppc.so.25.0.p/meson-generated_.._rte_power_cppc.pmd.c.o 00:02:11.233 [481/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o 00:02:11.233 [482/766] Compiling C object drivers/librte_power_acpi.so.25.0.p/meson-generated_.._rte_power_acpi.pmd.c.o 00:02:11.233 [483/766] Generating drivers/rte_power_amd_pstate.pmd.c with a custom command 00:02:11.233 [484/766] Compiling C object drivers/librte_power_acpi.a.p/meson-generated_.._rte_power_acpi.pmd.c.o 00:02:11.233 [485/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o 00:02:11.233 [486/766] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o 00:02:11.233 [487/766] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o 00:02:11.233 [488/766] Linking static target drivers/librte_power_acpi.a 00:02:11.233 [489/766] Compiling C object drivers/librte_power_amd_pstate.a.p/meson-generated_.._rte_power_amd_pstate.pmd.c.o 00:02:11.233 [490/766] Generating drivers/rte_power_intel_uncore.pmd.c with a custom command 00:02:11.233 [491/766] Linking static target lib/librte_port.a 00:02:11.233 [492/766] Compiling C object drivers/librte_power_amd_pstate.so.25.0.p/meson-generated_.._rte_power_amd_pstate.pmd.c.o 00:02:11.497 [493/766] Linking static target drivers/librte_power_amd_pstate.a 00:02:11.497 [494/766] Compiling C object drivers/librte_power_intel_uncore.a.p/meson-generated_.._rte_power_intel_uncore.pmd.c.o 00:02:11.497 [495/766] Compiling C object drivers/librte_power_intel_uncore.so.25.0.p/meson-generated_.._rte_power_intel_uncore.pmd.c.o 00:02:11.498 [496/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o 00:02:11.498 [497/766] Linking static target drivers/librte_power_intel_uncore.a 00:02:11.498 [498/766] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o 00:02:11.498 [499/766] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o 00:02:11.498 [500/766] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o 00:02:11.498 [501/766] Compiling C object app/dpdk-graph.p/graph_main.c.o 00:02:11.498 [502/766] Linking static target lib/librte_node.a 00:02:11.498 [503/766] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o 00:02:11.498 [504/766] Compiling C object app/dpdk-graph.p/graph_mempool.c.o 00:02:11.498 [505/766] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o 00:02:11.498 [506/766] Generating drivers/rte_power_intel_pstate.pmd.c with a custom command 00:02:11.498 [507/766] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o 00:02:11.498 [508/766] Compiling C object app/dpdk-graph.p/graph_l2fwd.c.o 00:02:11.498 [509/766] Compiling C object drivers/librte_power_intel_pstate.a.p/meson-generated_.._rte_power_intel_pstate.pmd.c.o 00:02:11.498 [510/766] Compiling C object drivers/librte_power_intel_pstate.so.25.0.p/meson-generated_.._rte_power_intel_pstate.pmd.c.o 00:02:11.498 [511/766] Linking static target drivers/librte_power_intel_pstate.a 00:02:11.498 [512/766] Compiling C object app/dpdk-graph.p/graph_utils.c.o 00:02:11.498 [513/766] Compiling C object app/dpdk-graph.p/graph_graph.c.o 00:02:11.498 [514/766] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.498 [515/766] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o 00:02:11.767 [516/766] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.767 [517/766] Compiling C object 
app/dpdk-test-eventdev.p/test-eventdev_parser.c.o 00:02:11.767 [518/766] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.767 [519/766] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:11.767 [520/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o 00:02:11.767 [521/766] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.767 [522/766] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output) 00:02:11.767 [523/766] Compiling C object drivers/librte_mempool_ring.so.25.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:11.767 [524/766] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:11.767 [525/766] Linking static target drivers/librte_mempool_ring.a 00:02:11.767 [526/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o 00:02:11.767 [527/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o 00:02:11.767 [528/766] Compiling C object app/dpdk-graph.p/graph_neigh.c.o 00:02:11.767 [529/766] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:12.032 [530/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o 00:02:12.032 [531/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o 00:02:12.032 [532/766] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:12.032 [533/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o 00:02:12.032 [534/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o 00:02:12.032 [535/766] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output) 00:02:12.032 [536/766] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o 00:02:12.032 [537/766] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o 00:02:12.032 [538/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o 00:02:12.032 [539/766] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o 00:02:12.032 [540/766] Compiling C object app/dpdk-pdump.p/pdump_main.c.o 00:02:12.032 [541/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o 00:02:12.032 [542/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o 00:02:12.032 [543/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o 00:02:12.293 [544/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o 00:02:12.293 [545/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o 00:02:12.293 [546/766] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o 00:02:12.293 [547/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o 00:02:12.293 [548/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o 00:02:12.293 [549/766] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output) 00:02:12.293 [550/766] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o 00:02:12.293 [551/766] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o 00:02:12.293 [552/766] Compiling C object 
app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o 00:02:12.293 [553/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o 00:02:12.293 [554/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o 00:02:12.293 [555/766] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o 00:02:12.293 [556/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o 00:02:12.293 [557/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o 00:02:12.293 [558/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o 00:02:12.293 [559/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o 00:02:12.293 [560/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o 00:02:12.294 [561/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o 00:02:12.294 [562/766] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o 00:02:12.294 [563/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o 00:02:12.294 [564/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o 00:02:12.553 [565/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o 00:02:12.553 [566/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o 00:02:12.553 [567/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o 00:02:12.553 [568/766] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:12.553 [569/766] Linking static target lib/librte_ethdev.a 00:02:12.553 [570/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o 00:02:12.553 [571/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o 00:02:12.553 [572/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o 00:02:12.553 [573/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o 00:02:12.553 [574/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o 00:02:12.553 [575/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o 00:02:12.553 [576/766] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o 00:02:12.553 [577/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o 00:02:12.553 [578/766] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o 00:02:12.553 [579/766] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o 00:02:12.553 [580/766] Linking static target drivers/net/i40e/base/libi40e_base.a 00:02:12.813 [581/766] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o 00:02:12.813 [582/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o 00:02:12.813 [583/766] Linking static target drivers/net/i40e/libi40e_avx512_lib.a 00:02:12.813 [584/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o 00:02:12.813 [585/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o 00:02:12.813 [586/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o 00:02:12.813 [587/766] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o 00:02:12.813 [588/766] Linking static target drivers/net/i40e/libi40e_avx2_lib.a 
00:02:12.813 [589/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o 00:02:12.813 [590/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o 00:02:12.813 [591/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o 00:02:13.071 [592/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o 00:02:13.071 [593/766] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o 00:02:13.071 [594/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o 00:02:13.071 [595/766] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o 00:02:13.071 [596/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o 00:02:13.071 [597/766] Compiling C object app/dpdk-testpmd.p/test-pmd_hairpin.c.o 00:02:13.071 [598/766] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o 00:02:13.071 [599/766] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o 00:02:13.071 [600/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o 00:02:13.071 [601/766] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o 00:02:13.071 [602/766] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o 00:02:13.071 [603/766] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o 00:02:13.071 [604/766] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o 00:02:13.071 [605/766] Compiling C object app/dpdk-test-security-perf.p/test_test_security_proto.c.o 00:02:13.071 [606/766] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o 00:02:13.071 [607/766] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o 00:02:13.071 [608/766] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o 00:02:13.071 [609/766] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o 00:02:13.330 [610/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o 00:02:13.330 [611/766] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o 00:02:13.330 [612/766] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o 00:02:13.330 [613/766] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o 00:02:13.331 [614/766] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o 00:02:13.331 [615/766] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o 00:02:13.331 [616/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o 00:02:13.331 [617/766] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o 00:02:13.331 [618/766] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o 00:02:13.331 [619/766] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o 00:02:13.588 [620/766] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o 00:02:13.588 [621/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o 00:02:13.588 [622/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o 00:02:13.588 [623/766] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o 00:02:13.846 [624/766] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o 00:02:13.846 [625/766] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o 00:02:13.846 [626/766] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o 00:02:14.104 [627/766] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o 00:02:14.104 [628/766] Generating 
lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:14.362 [629/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o 00:02:14.619 [630/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o 00:02:14.619 [631/766] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o 00:02:15.193 [632/766] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o 00:02:15.193 [633/766] Linking static target drivers/libtmp_rte_net_i40e.a 00:02:15.193 [634/766] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o 00:02:15.193 [635/766] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o 00:02:15.451 [636/766] Generating drivers/rte_net_i40e.pmd.c with a custom command 00:02:15.451 [637/766] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:15.451 [638/766] Compiling C object drivers/librte_net_i40e.so.25.0.p/meson-generated_.._rte_net_i40e.pmd.c.o 00:02:15.451 [639/766] Linking static target drivers/librte_net_i40e.a 00:02:15.709 [640/766] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o 00:02:15.967 [641/766] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:16.533 [642/766] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output) 00:02:16.533 [643/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o 00:02:17.097 [644/766] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o 00:02:18.469 [645/766] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:18.469 [646/766] Linking target lib/librte_eal.so.25.0 00:02:18.469 [647/766] Generating symbol file lib/librte_eal.so.25.0.p/librte_eal.so.25.0.symbols 00:02:18.469 [648/766] Linking target drivers/librte_bus_vdev.so.25.0 00:02:18.469 [649/766] Linking target lib/librte_cfgfile.so.25.0 00:02:18.469 [650/766] Linking target lib/librte_ring.so.25.0 00:02:18.469 [651/766] Linking target lib/librte_acl.so.25.0 00:02:18.469 [652/766] Linking target lib/librte_dmadev.so.25.0 00:02:18.727 [653/766] Linking target lib/librte_timer.so.25.0 00:02:18.727 [654/766] Linking target lib/librte_jobstats.so.25.0 00:02:18.727 [655/766] Linking target lib/librte_pci.so.25.0 00:02:18.727 [656/766] Linking target lib/librte_stack.so.25.0 00:02:18.727 [657/766] Linking target lib/librte_meter.so.25.0 00:02:18.727 [658/766] Linking target lib/librte_rawdev.so.25.0 00:02:18.727 [659/766] Generating symbol file lib/librte_ring.so.25.0.p/librte_ring.so.25.0.symbols 00:02:18.727 [660/766] Generating symbol file lib/librte_acl.so.25.0.p/librte_acl.so.25.0.symbols 00:02:18.727 [661/766] Generating symbol file lib/librte_pci.so.25.0.p/librte_pci.so.25.0.symbols 00:02:18.727 [662/766] Generating symbol file lib/librte_timer.so.25.0.p/librte_timer.so.25.0.symbols 00:02:18.727 [663/766] Generating symbol file drivers/librte_bus_vdev.so.25.0.p/librte_bus_vdev.so.25.0.symbols 00:02:18.727 [664/766] Generating symbol file lib/librte_dmadev.so.25.0.p/librte_dmadev.so.25.0.symbols 00:02:18.727 [665/766] Generating symbol file lib/librte_meter.so.25.0.p/librte_meter.so.25.0.symbols 00:02:18.727 [666/766] Linking target lib/librte_rcu.so.25.0 00:02:18.727 [667/766] Linking target lib/librte_mempool.so.25.0 00:02:18.727 [668/766] Linking target drivers/librte_bus_pci.so.25.0 00:02:18.985 [669/766] Generating symbol file lib/librte_mempool.so.25.0.p/librte_mempool.so.25.0.symbols 
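(Editor's note on the records in this stretch: the "Generating symbol file …/*.symbols" steps are meson's own bookkeeping — it records each shared object's export set so dependents are only relinked when exports actually change — while the DPDK-specific "…sym_chk" custom targets validate the exported symbols against each library's version map. The export set of a finished library can be inspected directly with binutils; a minimal sketch, assuming the build tree named in the `ninja … install` command later in this log:

    # Sketch: dump the dynamic symbols defined by the freshly linked EAL
    # library; meson derives its *.symbols files from exactly this export set.
    nm -D --defined-only \
        /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/lib/librte_eal.so.25.0 | head

The "Linking target … / Generating symbol file …" pairs that follow repeat this per library as the link graph is walked.)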
00:02:18.985 [670/766] Generating symbol file lib/librte_rcu.so.25.0.p/librte_rcu.so.25.0.symbols 00:02:18.985 [671/766] Generating symbol file drivers/librte_bus_pci.so.25.0.p/librte_bus_pci.so.25.0.symbols 00:02:18.985 [672/766] Linking target drivers/librte_mempool_ring.so.25.0 00:02:18.985 [673/766] Linking target lib/librte_mbuf.so.25.0 00:02:18.985 [674/766] Generating symbol file lib/librte_mbuf.so.25.0.p/librte_mbuf.so.25.0.symbols 00:02:19.242 [675/766] Linking target lib/librte_reorder.so.25.0 00:02:19.242 [676/766] Linking target lib/librte_bbdev.so.25.0 00:02:19.242 [677/766] Linking target lib/librte_cryptodev.so.25.0 00:02:19.242 [678/766] Linking target lib/librte_net.so.25.0 00:02:19.242 [679/766] Linking target lib/librte_compressdev.so.25.0 00:02:19.242 [680/766] Linking target lib/librte_mldev.so.25.0 00:02:19.242 [681/766] Linking target lib/librte_sched.so.25.0 00:02:19.242 [682/766] Linking target lib/librte_regexdev.so.25.0 00:02:19.242 [683/766] Linking target lib/librte_gpudev.so.25.0 00:02:19.242 [684/766] Linking target lib/librte_distributor.so.25.0 00:02:19.242 [685/766] Generating symbol file lib/librte_cryptodev.so.25.0.p/librte_cryptodev.so.25.0.symbols 00:02:19.242 [686/766] Generating symbol file lib/librte_reorder.so.25.0.p/librte_reorder.so.25.0.symbols 00:02:19.242 [687/766] Generating symbol file lib/librte_net.so.25.0.p/librte_net.so.25.0.symbols 00:02:19.242 [688/766] Generating symbol file lib/librte_sched.so.25.0.p/librte_sched.so.25.0.symbols 00:02:19.242 [689/766] Linking target lib/librte_security.so.25.0 00:02:19.242 [690/766] Linking target lib/librte_rib.so.25.0 00:02:19.242 [691/766] Linking target lib/librte_cmdline.so.25.0 00:02:19.242 [692/766] Linking target lib/librte_hash.so.25.0 00:02:19.500 [693/766] Generating symbol file lib/librte_security.so.25.0.p/librte_security.so.25.0.symbols 00:02:19.500 [694/766] Generating symbol file lib/librte_rib.so.25.0.p/librte_rib.so.25.0.symbols 00:02:19.500 [695/766] Generating symbol file lib/librte_hash.so.25.0.p/librte_hash.so.25.0.symbols 00:02:19.500 [696/766] Linking target lib/librte_pdcp.so.25.0 00:02:19.500 [697/766] Linking target lib/librte_efd.so.25.0 00:02:19.500 [698/766] Linking target lib/librte_fib.so.25.0 00:02:19.500 [699/766] Linking target lib/librte_member.so.25.0 00:02:19.500 [700/766] Linking target lib/librte_lpm.so.25.0 00:02:19.500 [701/766] Linking target lib/librte_ipsec.so.25.0 00:02:19.758 [702/766] Generating symbol file lib/librte_lpm.so.25.0.p/librte_lpm.so.25.0.symbols 00:02:19.758 [703/766] Generating symbol file lib/librte_ipsec.so.25.0.p/librte_ipsec.so.25.0.symbols 00:02:21.657 [704/766] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.916 [705/766] Linking target lib/librte_ethdev.so.25.0 00:02:21.916 [706/766] Generating symbol file lib/librte_ethdev.so.25.0.p/librte_ethdev.so.25.0.symbols 00:02:21.916 [707/766] Linking target lib/librte_bpf.so.25.0 00:02:21.916 [708/766] Linking target lib/librte_gso.so.25.0 00:02:21.916 [709/766] Linking target lib/librte_pcapng.so.25.0 00:02:21.916 [710/766] Linking target lib/librte_ip_frag.so.25.0 00:02:21.916 [711/766] Linking target lib/librte_gro.so.25.0 00:02:21.916 [712/766] Linking target lib/librte_metrics.so.25.0 00:02:21.916 [713/766] Linking target lib/librte_power.so.25.0 00:02:22.174 [714/766] Linking target lib/librte_eventdev.so.25.0 00:02:22.174 [715/766] Linking target drivers/librte_net_i40e.so.25.0 00:02:22.174 [716/766] Generating symbol file 
lib/librte_bpf.so.25.0.p/librte_bpf.so.25.0.symbols 00:02:22.174 [717/766] Generating symbol file lib/librte_metrics.so.25.0.p/librte_metrics.so.25.0.symbols 00:02:22.174 [718/766] Generating symbol file lib/librte_ip_frag.so.25.0.p/librte_ip_frag.so.25.0.symbols 00:02:22.174 [719/766] Generating symbol file lib/librte_power.so.25.0.p/librte_power.so.25.0.symbols 00:02:22.174 [720/766] Generating symbol file lib/librte_pcapng.so.25.0.p/librte_pcapng.so.25.0.symbols 00:02:22.174 [721/766] Generating symbol file lib/librte_eventdev.so.25.0.p/librte_eventdev.so.25.0.symbols 00:02:22.174 [722/766] Linking target drivers/librte_power_acpi.so.25.0 00:02:22.174 [723/766] Linking target lib/librte_latencystats.so.25.0 00:02:22.174 [724/766] Linking target drivers/librte_power_kvm_vm.so.25.0 00:02:22.174 [725/766] Linking target lib/librte_bitratestats.so.25.0 00:02:22.174 [726/766] Linking target drivers/librte_power_intel_pstate.so.25.0 00:02:22.175 [727/766] Linking target drivers/librte_power_intel_uncore.so.25.0 00:02:22.175 [728/766] Linking target drivers/librte_power_amd_pstate.so.25.0 00:02:22.175 [729/766] Linking target lib/librte_graph.so.25.0 00:02:22.175 [730/766] Linking target drivers/librte_power_cppc.so.25.0 00:02:22.175 [731/766] Linking target lib/librte_pdump.so.25.0 00:02:22.175 [732/766] Linking target lib/librte_dispatcher.so.25.0 00:02:22.175 [733/766] Linking target lib/librte_port.so.25.0 00:02:22.433 [734/766] Generating symbol file lib/librte_graph.so.25.0.p/librte_graph.so.25.0.symbols 00:02:22.433 [735/766] Linking target lib/librte_node.so.25.0 00:02:22.433 [736/766] Generating symbol file lib/librte_port.so.25.0.p/librte_port.so.25.0.symbols 00:02:22.433 [737/766] Linking target lib/librte_table.so.25.0 00:02:22.693 [738/766] Generating symbol file lib/librte_table.so.25.0.p/librte_table.so.25.0.symbols 00:02:23.261 [739/766] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o 00:02:23.261 [740/766] Linking static target lib/librte_pipeline.a 00:02:24.197 [741/766] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:24.198 [742/766] Linking static target lib/librte_vhost.a 00:02:24.764 [743/766] Linking target app/dpdk-test-fib 00:02:24.764 [744/766] Linking target app/dpdk-proc-info 00:02:24.764 [745/766] Linking target app/dpdk-dumpcap 00:02:24.764 [746/766] Linking target app/dpdk-test-acl 00:02:24.764 [747/766] Linking target app/dpdk-test-flow-perf 00:02:24.764 [748/766] Linking target app/dpdk-test-compress-perf 00:02:24.764 [749/766] Linking target app/dpdk-pdump 00:02:24.764 [750/766] Linking target app/dpdk-test-regex 00:02:24.764 [751/766] Linking target app/dpdk-test-dma-perf 00:02:24.764 [752/766] Linking target app/dpdk-test-cmdline 00:02:24.764 [753/766] Linking target app/dpdk-test-bbdev 00:02:24.764 [754/766] Linking target app/dpdk-test-pipeline 00:02:24.764 [755/766] Linking target app/dpdk-test-mldev 00:02:24.764 [756/766] Linking target app/dpdk-test-sad 00:02:24.764 [757/766] Linking target app/dpdk-test-security-perf 00:02:24.764 [758/766] Linking target app/dpdk-test-gpudev 00:02:24.764 [759/766] Linking target app/dpdk-test-crypto-perf 00:02:24.764 [760/766] Linking target app/dpdk-graph 00:02:24.764 [761/766] Linking target app/dpdk-test-eventdev 00:02:24.764 [762/766] Linking target app/dpdk-testpmd 00:02:26.664 [763/766] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.664 [764/766] Linking target lib/librte_vhost.so.25.0 00:02:29.194 [765/766] Generating 
lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.194 [766/766] Linking target lib/librte_pipeline.so.25.0 00:02:29.194 12:32:59 build_native_dpdk -- common/autobuild_common.sh@201 -- $ uname -s 00:02:29.194 12:32:59 build_native_dpdk -- common/autobuild_common.sh@201 -- $ [[ Linux == \F\r\e\e\B\S\D ]] 00:02:29.194 12:32:59 build_native_dpdk -- common/autobuild_common.sh@214 -- $ ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp -j72 install 00:02:29.194 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp' 00:02:29.194 [0/1] Installing files. 00:02:29.458 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/memory.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/cpu.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/telemetry-endpoints/counters.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/telemetry-endpoints 00:02:29.458 Installing subdir /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/packet_ordering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/packet_ordering 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/basicfwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/skeleton/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/skeleton 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/helloworld/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/helloworld 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/obj.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipv6_addr_swap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.spec to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/hash_func.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipv6_addr_swap.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/varbit.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec_sa.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ethdev.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/learner.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/mirroring.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/ipsec.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/recirculation.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/meter.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.458 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_routing_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/registers.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.spec to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/pcap.io to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/packet.txt to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/pipeline/examples/selector.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/pipeline/examples 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_xts.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_cmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_tdes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_eddsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_hmac.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_ccm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_aes.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_rsa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_sha.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_validation_gcm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/fips_dev_self_test.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/fips_validation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/fips_validation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-jobstats/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-jobstats 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/service_cores/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/service_cores 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/virtio_net.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/pkt_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/sse/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/sse 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/altivec/port_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/altivec 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/common/neon/port_group.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/common/neon 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-macsec/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-macsec 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/flow_skeleton.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_gre.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_mpls.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_mpls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_ipv4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/flow_filtering/snippets/snippet_match_ipv4.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/flow_filtering/snippets 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_fragmentation/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_fragmentation 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:29.459 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_monitor.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.459 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/channel_manager.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/power_manager.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/vm_power_cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:29.460 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vm_power_manager/guest_cli/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/perf_core.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-power/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-power 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/rxtx_callbacks/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bond/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bond 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/link_status_interrupt/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/link_status_interrupt 00:02:29.460 
Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bbdev_app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bbdev_app 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_poll.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-event 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t2.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/README to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t1.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/dummy.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/bpf/t3.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/bpf 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:29.460 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/vhost_blk_compat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/blk_spec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vhost_blk/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vhost_blk 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/node.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/server_node_efd/efd_node/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/ptpclient.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ptpclient/Makefile to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ptpclient 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_reassembly/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_reassembly 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/parse_obj_list.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:29.460 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/cmdline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/cmdline 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/vdpa_blk_compact.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vdpa/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vdpa 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/dmafwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/dma/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/dma 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/rte_policer.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_meter/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_meter 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/cat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-cat 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/timer/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/timer 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/stats.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_red.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_pie.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/profile_ov.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cmdline.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/cfg_file.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/qos_sched/app_thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/qos_sched 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipv4_multicast/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipv4_multicast 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_route.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/lpm_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v4.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_fib.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_route_parse.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/em_default_v6.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_event_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_altivec.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sa.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_worker.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.461 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_process.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp4.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep1.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/rt.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/event_helper.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ep0.cfg to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/sp6.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/run_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/linux_test.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/pkttest.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/load_env.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ipsec-secgw/test 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/shm.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent 00:02:29.462 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/conn.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cryptodev.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/parser.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/swq.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tmgr.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/tap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/link.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/pipeline.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/cli.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/thread.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/firewall.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/route.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/tap.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ip_pipeline/examples/rss.cli to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ip_pipeline/examples 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/vmdq_dcb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/ntb_fwd.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:29.463 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ntb/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ntb 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/ethapp.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/ethtool-app/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/ethtool-app 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/rte_ethtool.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/ethtool/lib/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/ethtool/lib 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/mp_commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/simple_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/simple_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/main.c to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/commands.list to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/hotplug_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:29.463 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/client_server_mp/shared/common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/multi_process/symmetric_mp/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp 
00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/eventdev_pipeline/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/eventdev_pipeline 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/distributor/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/distributor 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l3fwd-graph/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l3fwd-graph 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/main.c to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:29.464 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/examples/l2fwd-crypto/Makefile to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/share/dpdk/examples/l2fwd-crypto 00:02:29.464 Installing lib/librte_log.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_log.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_kvargs.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_kvargs.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_argparse.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_argparse.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_telemetry.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_telemetry.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_eal.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_eal.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing 
lib/librte_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_rcu.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_rcu.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_mempool.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_mempool.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_mbuf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_mbuf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_net.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_net.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_meter.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_meter.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_ethdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_ethdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_cmdline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_cmdline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_metrics.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_metrics.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_hash.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_hash.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_timer.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_timer.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_acl.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_acl.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_bbdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_bbdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_bitratestats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_bitratestats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_bpf.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_bpf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_cfgfile.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 
Installing lib/librte_cfgfile.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_compressdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_compressdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_cryptodev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_cryptodev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_distributor.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_distributor.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_dmadev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_dmadev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_efd.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_efd.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_eventdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_eventdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_dispatcher.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_dispatcher.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_gpudev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_gpudev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_gro.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_gro.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_gso.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_gso.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_ip_frag.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_ip_frag.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_jobstats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_jobstats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_latencystats.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_latencystats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_lpm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_lpm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_member.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_member.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_pcapng.a to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_pcapng.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_power.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_power.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_rawdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_rawdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_regexdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.464 Installing lib/librte_regexdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_mldev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_mldev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_rib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_rib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_reorder.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_reorder.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_sched.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_sched.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_security.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_security.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_stack.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_stack.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_vhost.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_vhost.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_ipsec.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_ipsec.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_pdcp.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_pdcp.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_fib.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_fib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_port.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_port.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_pdump.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_pdump.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing 
lib/librte_table.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_table.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_pipeline.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_pipeline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_graph.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_graph.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_node.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing lib/librte_node.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing drivers/librte_bus_pci.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing drivers/librte_bus_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.465 Installing drivers/librte_bus_vdev.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing drivers/librte_bus_vdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.465 Installing drivers/librte_mempool_ring.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.465 Installing drivers/librte_mempool_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.727 Installing drivers/librte_net_i40e.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.727 Installing drivers/librte_net_i40e.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.727 Installing drivers/librte_power_acpi.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.727 Installing drivers/librte_power_acpi.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.727 Installing drivers/librte_power_amd_pstate.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.727 Installing drivers/librte_power_amd_pstate.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.727 Installing drivers/librte_power_cppc.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.727 Installing drivers/librte_power_cppc.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.727 Installing drivers/librte_power_intel_pstate.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.727 Installing drivers/librte_power_intel_pstate.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.727 Installing drivers/librte_power_intel_uncore.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.727 Installing drivers/librte_power_intel_uncore.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.727 Installing drivers/librte_power_kvm_vm.a to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:02:29.727 Installing drivers/librte_power_kvm_vm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0 00:02:29.727 Installing app/dpdk-dumpcap to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-graph to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-pdump to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-proc-info to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-acl to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-bbdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-cmdline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-compress-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-crypto-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-dma-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-eventdev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-fib to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-flow-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-gpudev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-mldev to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-pipeline to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-testpmd to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-regex to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-sad to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing app/dpdk-test-security-perf to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/rte_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/log/rte_log.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/kvargs/rte_kvargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/argparse/rte_argparse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/telemetry/rte_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cpuflags.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.727 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/generic/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include/generic 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cpuflags.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_cycles.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_io.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_memcpy.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_pause.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_prefetch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rtm.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_rwlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_spinlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_vect.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_atomic_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_alarm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitmap.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bitset.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_branch_prediction.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_bus.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_class.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_compat.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_debug.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_dev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_devargs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_memconfig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_eal_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_errno.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_epoll.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_fbarray.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hexdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_hypervisor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_interrupts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_keepalive.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_launch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lcore_var.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_lock_annotations.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_malloc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_mcslock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memory.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_memzone.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pci_dev_features.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_per_lcore.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_pflock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_random.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_reciprocal.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqcount.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_seqlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_service_component.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_stdatomic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_string_fns.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_tailq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_thread.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_ticketlock.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_time.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_trace_point_register.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_uuid.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_version.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/include/rte_vfio.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eal/linux/include/rte_os.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ptr_compress/rte_ptr_compress.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.728 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_c11_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_generic_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_peek_zc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rcu/rte_rcu_qsbr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mempool/rte_mempool_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_ptype.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mbuf/rte_mbuf_dyn.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_cksum.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip4.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ip6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_udp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_tls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_dtls.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_esp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_sctp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_icmp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_arp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ether.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_macsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_vxlan.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gre.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_gtp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_net_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_mpls.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_higig.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ecpri.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_pdcp_hdr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_geneve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_l2tpv2.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ppp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/net/rte_ib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/meter/rte_meter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_cman.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_dev_info.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_flow_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_mtr_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_tm_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_ethdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ethdev/rte_eth_ctrl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pci/rte_pci.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_num.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_string.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_rdline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_vt100.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_socket.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_cirbuf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cmdline/cmdline_parse_portlist.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/metrics/rte_metrics_telemetry.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_fbk_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash_crc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_jhash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_sw.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_crc_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.729 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/hash/rte_thash_x86_gfni.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/timer/rte_timer.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/acl/rte_acl_osdep.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bbdev/rte_bbdev_op.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bitratestats/rte_bitrate.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/bpf_def.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/bpf/rte_bpf_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cfgfile/rte_cfgfile.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_compressdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/compressdev/rte_comp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_sym.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_crypto_asym.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/cryptodev/rte_cryptodev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/distributor/rte_distributor.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dmadev/rte_dmadev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/efd/rte_efd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_dma_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_event_timer_adapter.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/eventdev/rte_eventdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/dispatcher/rte_dispatcher.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gpudev/rte_gpudev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gro/rte_gro.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/gso/rte_gso.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ip_frag/rte_ip_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/jobstats/rte_jobstats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/latencystats/rte_latencystats.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_altivec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_neon.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_scalar.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sse.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/lpm/rte_lpm_sve.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/member/rte_member.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pcapng/rte_pcapng.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/power_cpufreq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/power_uncore_ops.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_cpufreq.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_pmd_mgmt.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_qos.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/power/rte_power_uncore.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rawdev/rte_rawdev_pmd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/regexdev/rte_regexdev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/mldev/rte_mldev_core.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/rib/rte_rib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/reorder/rte_reorder.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_approx.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_red.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_sched_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/sched/rte_pie.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.730 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/security/rte_security_driver.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_std.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_generic.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_c11.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/stack/rte_stack_lf_stubs.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vdpa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_async.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/vhost/rte_vhost_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sa.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_sad.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/ipsec/rte_ipsec_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdcp/rte_pdcp_group.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/fib/rte_fib6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_frag.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ras.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sched.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_sym_crypto.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_port_eventdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ethdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_fd.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_ring.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/port/rte_swx_port_source_sink.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pdump/rte_pdump.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_hash_func.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_em.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_learner.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_selector.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_swx_table_wm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_acl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_array.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_cuckoo.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func.h 
to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_lpm_ipv6.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_stub.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_lru_x86.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/table/rte_table_hash_func_arm64.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_port_in_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_table_action.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ipsec.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_pipeline.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_extern.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/pipeline/rte_swx_ctl.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_model_rtc.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/graph/rte_graph_worker_common.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_eth_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip4_api.h to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_ip6_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/lib/node/rte_node_udp4_input_api.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/pci/rte_bus_pci.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/drivers/power/kvm_vm/rte_power_guest_channel.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/buildtools/dpdk-cmdline-gen.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-devbind.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-pmdinfo.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-hugepages.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-rss-flows.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.731 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/usertools/dpdk-telemetry-exporter.py to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/bin 00:02:29.732 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/rte_build_config.h to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:02:29.732 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:29.732 Installing /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build-tmp/meson-private/libdpdk.pc to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig 00:02:29.732 Installing symlink pointing to librte_log.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so.25 00:02:29.732 Installing symlink pointing to librte_log.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_log.so 00:02:29.732 Installing symlink pointing to librte_kvargs.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so.25 00:02:29.732 Installing symlink pointing to librte_kvargs.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_kvargs.so 00:02:29.732 Installing symlink pointing to librte_argparse.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_argparse.so.25 00:02:29.732 Installing symlink pointing to librte_argparse.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_argparse.so 00:02:29.732 Installing symlink pointing to librte_telemetry.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so.25 00:02:29.732 Installing symlink pointing to librte_telemetry.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_telemetry.so 00:02:29.732 Installing symlink pointing to librte_eal.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so.25 00:02:29.732 Installing symlink pointing to librte_eal.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eal.so 00:02:29.732 Installing symlink pointing to librte_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so.25 00:02:29.732 Installing symlink pointing to librte_ring.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ring.so 00:02:29.732 Installing symlink pointing to librte_rcu.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so.25 00:02:29.732 Installing symlink pointing to librte_rcu.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rcu.so 00:02:29.732 Installing symlink pointing to librte_mempool.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so.25 00:02:29.732 Installing symlink pointing to librte_mempool.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mempool.so 00:02:29.732 Installing symlink pointing to librte_mbuf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so.25 00:02:29.732 Installing symlink pointing to librte_mbuf.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mbuf.so 00:02:29.732 Installing symlink pointing to librte_net.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so.25 00:02:29.732 Installing symlink pointing to librte_net.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_net.so 00:02:29.732 Installing symlink pointing to librte_meter.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so.25 00:02:29.732 Installing symlink pointing to librte_meter.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_meter.so 00:02:29.732 Installing symlink pointing to librte_ethdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so.25 00:02:29.732 Installing symlink pointing to librte_ethdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ethdev.so 00:02:29.732 Installing symlink pointing to librte_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so.25 00:02:29.732 Installing symlink pointing to librte_pci.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pci.so 00:02:29.732 Installing symlink pointing to librte_cmdline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so.25 00:02:29.732 Installing symlink pointing to librte_cmdline.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cmdline.so 00:02:29.732 Installing symlink pointing to librte_metrics.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so.25 00:02:29.732 Installing symlink pointing to librte_metrics.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_metrics.so 00:02:29.732 Installing symlink pointing to librte_hash.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so.25 00:02:29.732 Installing symlink pointing to librte_hash.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_hash.so 00:02:29.732 Installing symlink pointing to librte_timer.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so.25 00:02:29.732 Installing symlink pointing to librte_timer.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_timer.so 00:02:29.732 Installing symlink pointing to librte_acl.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so.25 00:02:29.732 Installing symlink pointing to librte_acl.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_acl.so 00:02:29.732 Installing symlink pointing to librte_bbdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so.25 00:02:29.732 Installing symlink pointing to librte_bbdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bbdev.so 00:02:29.732 Installing symlink pointing to librte_bitratestats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so.25 00:02:29.732 Installing symlink pointing to librte_bitratestats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bitratestats.so 00:02:29.732 Installing symlink pointing to librte_bpf.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so.25 00:02:29.732 Installing symlink pointing to librte_bpf.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_bpf.so 00:02:29.732 Installing symlink pointing to librte_cfgfile.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so.25 00:02:29.732 Installing symlink pointing to librte_cfgfile.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cfgfile.so 00:02:29.732 Installing symlink pointing to librte_compressdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so.25 00:02:29.732 Installing symlink pointing to librte_compressdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_compressdev.so 00:02:29.732 Installing symlink pointing to librte_cryptodev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so.25 00:02:29.732 Installing symlink pointing to librte_cryptodev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_cryptodev.so 00:02:29.732 Installing symlink pointing to librte_distributor.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so.25 00:02:29.732 Installing symlink pointing to librte_distributor.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_distributor.so 00:02:29.732 Installing symlink pointing to librte_dmadev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so.25 00:02:29.732 Installing symlink pointing to librte_dmadev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dmadev.so 00:02:29.732 
Installing symlink pointing to librte_efd.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so.25 00:02:29.732 Installing symlink pointing to librte_efd.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_efd.so 00:02:29.732 Installing symlink pointing to librte_eventdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so.25 00:02:29.732 Installing symlink pointing to librte_eventdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_eventdev.so 00:02:29.732 Installing symlink pointing to librte_dispatcher.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so.25 00:02:29.732 Installing symlink pointing to librte_dispatcher.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_dispatcher.so 00:02:29.732 Installing symlink pointing to librte_gpudev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so.25 00:02:29.732 Installing symlink pointing to librte_gpudev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gpudev.so 00:02:29.732 Installing symlink pointing to librte_gro.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so.25 00:02:29.732 Installing symlink pointing to librte_gro.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gro.so 00:02:29.732 Installing symlink pointing to librte_gso.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so.25 00:02:29.732 Installing symlink pointing to librte_gso.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_gso.so 00:02:29.732 Installing symlink pointing to librte_ip_frag.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so.25 00:02:29.732 Installing symlink pointing to librte_ip_frag.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ip_frag.so 00:02:29.732 Installing symlink pointing to librte_jobstats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so.25 00:02:29.732 Installing symlink pointing to librte_jobstats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_jobstats.so 00:02:29.732 Installing symlink pointing to librte_latencystats.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so.25 00:02:29.732 Installing symlink pointing to librte_latencystats.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_latencystats.so 00:02:29.732 Installing symlink pointing to librte_lpm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so.25 00:02:29.732 Installing symlink pointing to librte_lpm.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_lpm.so 00:02:29.732 Installing symlink pointing to librte_member.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so.25 00:02:29.732 Installing symlink pointing to librte_member.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_member.so 00:02:29.732 Installing symlink pointing to librte_pcapng.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so.25 00:02:29.732 Installing symlink pointing to librte_pcapng.so.25 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pcapng.so 00:02:29.732 Installing symlink pointing to librte_power.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so.25 00:02:29.732 Installing symlink pointing to librte_power.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_power.so 00:02:29.733 Installing symlink pointing to librte_rawdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so.25 00:02:29.733 Installing symlink pointing to librte_rawdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rawdev.so 00:02:29.733 Installing symlink pointing to librte_regexdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so.25 00:02:29.733 Installing symlink pointing to librte_regexdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_regexdev.so 00:02:29.733 Installing symlink pointing to librte_mldev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so.25 00:02:29.733 Installing symlink pointing to librte_mldev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_mldev.so 00:02:29.733 Installing symlink pointing to librte_rib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so.25 00:02:29.733 Installing symlink pointing to librte_rib.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_rib.so 00:02:29.733 Installing symlink pointing to librte_reorder.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so.25 00:02:29.733 Installing symlink pointing to librte_reorder.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_reorder.so 00:02:29.733 Installing symlink pointing to librte_sched.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so.25 00:02:29.733 Installing symlink pointing to librte_sched.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_sched.so 00:02:29.733 Installing symlink pointing to librte_security.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so.25 00:02:29.733 Installing symlink pointing to librte_security.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_security.so 00:02:29.733 Installing symlink pointing to librte_stack.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so.25 00:02:29.733 Installing symlink pointing to librte_stack.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_stack.so 00:02:29.733 Installing symlink pointing to librte_vhost.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so.25 00:02:29.733 Installing symlink pointing to librte_vhost.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_vhost.so 00:02:29.733 Installing symlink pointing to librte_ipsec.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so.25 00:02:29.733 Installing symlink pointing to librte_ipsec.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_ipsec.so 00:02:29.733 Installing symlink pointing to librte_pdcp.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so.25 00:02:29.733 Installing symlink pointing to librte_pdcp.so.25 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdcp.so 00:02:29.733 Installing symlink pointing to librte_fib.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so.25 00:02:29.733 Installing symlink pointing to librte_fib.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_fib.so 00:02:29.733 Installing symlink pointing to librte_port.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so.25 00:02:29.733 Installing symlink pointing to librte_port.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_port.so 00:02:29.733 Installing symlink pointing to librte_pdump.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so.25 00:02:29.733 Installing symlink pointing to librte_pdump.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pdump.so 00:02:29.733 Installing symlink pointing to librte_table.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so.25 00:02:29.733 Installing symlink pointing to librte_table.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_table.so 00:02:29.733 Installing symlink pointing to librte_pipeline.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so.25 00:02:29.733 Installing symlink pointing to librte_pipeline.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_pipeline.so 00:02:29.733 Installing symlink pointing to librte_graph.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so.25 00:02:29.733 Installing symlink pointing to librte_graph.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_graph.so 00:02:29.733 Installing symlink pointing to librte_node.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so.25 00:02:29.733 Installing symlink pointing to librte_node.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/librte_node.so 00:02:29.733 Installing symlink pointing to librte_bus_pci.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_pci.so.25 00:02:29.733 Installing symlink pointing to librte_bus_pci.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_pci.so 00:02:29.733 Installing symlink pointing to librte_bus_vdev.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_vdev.so.25 00:02:29.733 Installing symlink pointing to librte_bus_vdev.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_bus_vdev.so 00:02:29.733 Installing symlink pointing to librte_mempool_ring.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_mempool_ring.so.25 00:02:29.733 Installing symlink pointing to librte_mempool_ring.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_mempool_ring.so 00:02:29.733 Installing symlink pointing to librte_net_i40e.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_net_i40e.so.25 00:02:29.733 Installing symlink pointing to librte_net_i40e.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_net_i40e.so 00:02:29.733 Installing symlink pointing to librte_power_acpi.so.25.0 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_acpi.so.25 00:02:29.733 Installing symlink pointing to librte_power_acpi.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_acpi.so 00:02:29.733 './librte_bus_pci.so' -> 'dpdk/pmds-25.0/librte_bus_pci.so' 00:02:29.733 './librte_bus_pci.so.25' -> 'dpdk/pmds-25.0/librte_bus_pci.so.25' 00:02:29.733 './librte_bus_pci.so.25.0' -> 'dpdk/pmds-25.0/librte_bus_pci.so.25.0' 00:02:29.733 './librte_bus_vdev.so' -> 'dpdk/pmds-25.0/librte_bus_vdev.so' 00:02:29.733 './librte_bus_vdev.so.25' -> 'dpdk/pmds-25.0/librte_bus_vdev.so.25' 00:02:29.733 './librte_bus_vdev.so.25.0' -> 'dpdk/pmds-25.0/librte_bus_vdev.so.25.0' 00:02:29.733 './librte_mempool_ring.so' -> 'dpdk/pmds-25.0/librte_mempool_ring.so' 00:02:29.733 './librte_mempool_ring.so.25' -> 'dpdk/pmds-25.0/librte_mempool_ring.so.25' 00:02:29.733 './librte_mempool_ring.so.25.0' -> 'dpdk/pmds-25.0/librte_mempool_ring.so.25.0' 00:02:29.733 './librte_net_i40e.so' -> 'dpdk/pmds-25.0/librte_net_i40e.so' 00:02:29.733 './librte_net_i40e.so.25' -> 'dpdk/pmds-25.0/librte_net_i40e.so.25' 00:02:29.733 './librte_net_i40e.so.25.0' -> 'dpdk/pmds-25.0/librte_net_i40e.so.25.0' 00:02:29.733 './librte_power_acpi.so' -> 'dpdk/pmds-25.0/librte_power_acpi.so' 00:02:29.733 './librte_power_acpi.so.25' -> 'dpdk/pmds-25.0/librte_power_acpi.so.25' 00:02:29.733 './librte_power_acpi.so.25.0' -> 'dpdk/pmds-25.0/librte_power_acpi.so.25.0' 00:02:29.733 './librte_power_amd_pstate.so' -> 'dpdk/pmds-25.0/librte_power_amd_pstate.so' 00:02:29.733 './librte_power_amd_pstate.so.25' -> 'dpdk/pmds-25.0/librte_power_amd_pstate.so.25' 00:02:29.733 './librte_power_amd_pstate.so.25.0' -> 'dpdk/pmds-25.0/librte_power_amd_pstate.so.25.0' 00:02:29.733 './librte_power_cppc.so' -> 'dpdk/pmds-25.0/librte_power_cppc.so' 00:02:29.733 './librte_power_cppc.so.25' -> 'dpdk/pmds-25.0/librte_power_cppc.so.25' 00:02:29.733 './librte_power_cppc.so.25.0' -> 'dpdk/pmds-25.0/librte_power_cppc.so.25.0' 00:02:29.733 './librte_power_intel_pstate.so' -> 'dpdk/pmds-25.0/librte_power_intel_pstate.so' 00:02:29.733 './librte_power_intel_pstate.so.25' -> 'dpdk/pmds-25.0/librte_power_intel_pstate.so.25' 00:02:29.733 './librte_power_intel_pstate.so.25.0' -> 'dpdk/pmds-25.0/librte_power_intel_pstate.so.25.0' 00:02:29.733 './librte_power_intel_uncore.so' -> 'dpdk/pmds-25.0/librte_power_intel_uncore.so' 00:02:29.733 './librte_power_intel_uncore.so.25' -> 'dpdk/pmds-25.0/librte_power_intel_uncore.so.25' 00:02:29.733 './librte_power_intel_uncore.so.25.0' -> 'dpdk/pmds-25.0/librte_power_intel_uncore.so.25.0' 00:02:29.733 './librte_power_kvm_vm.so' -> 'dpdk/pmds-25.0/librte_power_kvm_vm.so' 00:02:29.733 './librte_power_kvm_vm.so.25' -> 'dpdk/pmds-25.0/librte_power_kvm_vm.so.25' 00:02:29.733 './librte_power_kvm_vm.so.25.0' -> 'dpdk/pmds-25.0/librte_power_kvm_vm.so.25.0' 00:02:29.733 Installing symlink pointing to librte_power_amd_pstate.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_amd_pstate.so.25 00:02:29.733 Installing symlink pointing to librte_power_amd_pstate.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_amd_pstate.so 00:02:29.733 Installing symlink pointing to librte_power_cppc.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_cppc.so.25 00:02:29.733 Installing symlink pointing to librte_power_cppc.so.25 to 
/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_cppc.so
00:02:29.733 Installing symlink pointing to librte_power_intel_pstate.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_intel_pstate.so.25
00:02:29.733 Installing symlink pointing to librte_power_intel_pstate.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_intel_pstate.so
00:02:29.733 Installing symlink pointing to librte_power_intel_uncore.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_intel_uncore.so.25
00:02:29.733 Installing symlink pointing to librte_power_intel_uncore.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_intel_uncore.so
00:02:29.733 Installing symlink pointing to librte_power_kvm_vm.so.25.0 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_kvm_vm.so.25
00:02:29.733 Installing symlink pointing to librte_power_kvm_vm.so.25 to /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/dpdk/pmds-25.0/librte_power_kvm_vm.so
00:02:29.733 Running custom install script '/bin/sh /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-25.0'
00:02:29.992 12:32:59 build_native_dpdk -- common/autobuild_common.sh@220 -- $ cat
00:02:29.992 12:32:59 build_native_dpdk -- common/autobuild_common.sh@225 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:02:29.992
00:02:29.992 real 0m30.964s
00:02:29.992 user 9m11.039s
00:02:29.992 sys 2m7.903s
00:02:29.992 12:32:59 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:29.992 12:32:59 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x
00:02:29.992 ************************************
00:02:29.992 END TEST build_native_dpdk
00:02:29.992 ************************************
00:02:29.992 12:32:59 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:29.992 12:32:59 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:29.992 12:32:59 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]]
00:02:29.992 12:32:59 -- spdk/autobuild.sh@52 -- $ llvm_precompile
00:02:29.992 12:32:59 -- common/autobuild_common.sh@445 -- $ run_test autobuild_llvm_precompile _llvm_precompile
00:02:29.992 12:32:59 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 ']'
00:02:29.992 12:32:59 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:02:29.992 12:32:59 -- common/autotest_common.sh@10 -- $ set +x
00:02:29.992 ************************************
00:02:29.992 START TEST autobuild_llvm_precompile
00:02:29.992 ************************************
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autotest_common.sh@1129 -- $ _llvm_precompile
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ clang --version
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39)
00:02:29.992 Target: x86_64-redhat-linux-gnu
00:02:29.992 Thread model: posix
00:02:29.992 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]]
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@33 -- $ clang_num=17
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ export CC=clang-17
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@35 -- $ CC=clang-17
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@36 -- $ CXX=clang++-17
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]]
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a'
00:02:29.992 12:32:59 autobuild_llvm_precompile -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:30.251 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:30.251 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:30.251 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:30.509 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:30.768 Using 'verbs' RDMA provider
00:02:47.028 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:02:59.235 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:02:59.235 Creating mk/config.mk...done.
00:02:59.235 Creating mk/cc.flags.mk...done.
00:02:59.235 Type 'make' to build.
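The precompile trace above resolves clang's libFuzzer runtime with an extended glob and threads it into configure via --with-fuzzer. A minimal standalone sketch of that lookup, assuming bash with extglob and a Fedora-style clang layout; the @(...) version alternation from the trace is collapsed to the major version here, and the error message is illustrative rather than taken from this log:

  #!/usr/bin/env bash
  # Sketch: locate clang's fuzzer_no_main archive and pass it to SPDK's configure.
  shopt -s extglob nullglob            # extended globs; unmatched globs expand to nothing
  clang_num=17                         # major version, as parsed from 'clang --version' above
  fuzzer_libs=(/usr/lib*/clang/"$clang_num"/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
  if (( ${#fuzzer_libs[@]} == 0 )); then
      echo "no libclang_rt.fuzzer_no_main found for clang $clang_num" >&2
      exit 1
  fi
  fuzzer_lib=${fuzzer_libs[0]}         # first match, e.g. /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/...
  CC=clang-$clang_num CXX=clang++-$clang_num \
      ./configure --with-fuzzer="$fuzzer_lib"

The [[ -e ... ]] test at autobuild_common.sh@40 in the trace plays the same role as the existence check here: configure is only invoked once the fuzzer archive is confirmed on disk.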
00:02:59.235
00:02:59.235 real 0m28.921s
00:02:59.235 user 0m12.950s
00:02:59.235 sys 0m15.212s
00:02:59.235 12:33:28 autobuild_llvm_precompile -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:59.235 12:33:28 autobuild_llvm_precompile -- common/autotest_common.sh@10 -- $ set +x
00:02:59.235 ************************************
00:02:59.235 END TEST autobuild_llvm_precompile
00:02:59.235 ************************************
00:02:59.235 12:33:28 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:59.235 12:33:28 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:59.235 12:33:28 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:59.235 12:33:28 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]]
00:02:59.235 12:33:28 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:02:59.235 Using /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib/pkgconfig for additional libs...
00:02:59.235 DPDK libraries: /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib
00:02:59.235 DPDK includes: //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include
00:02:59.494 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:02:59.752 Using 'verbs' RDMA provider
00:03:13.321 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done.
00:03:23.291 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:03:23.291 Creating mk/config.mk...done.
00:03:23.291 Creating mk/cc.flags.mk...done.
00:03:23.291 Type 'make' to build.
00:03:23.291 12:33:53 -- spdk/autobuild.sh@70 -- $ run_test make make -j72
00:03:23.291 12:33:53 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']'
00:03:23.291 12:33:53 -- common/autotest_common.sh@1111 -- $ xtrace_disable
00:03:23.291 12:33:53 -- common/autotest_common.sh@10 -- $ set +x
00:03:23.291 ************************************
00:03:23.291 START TEST make
00:03:23.291 ************************************
00:03:23.291 12:33:53 make -- common/autotest_common.sh@1129 -- $ make -j72
00:03:23.551 make[1]: Nothing to be done for 'all'.
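Every START TEST/END TEST banner and real/user/sys block in this log comes from the run_test helper invoked above (run_test autobuild_llvm_precompile _llvm_precompile, run_test make make -j72). A rough sketch of such a wrapper, inferred from the visible output only; the actual function in SPDK's autotest_common.sh also manages xtrace state and argument checks like the '[' 3 -le 1 ']' guard seen in the trace:

  # Hypothetical run_test-style wrapper; shape reconstructed from the log output.
  run_test() {
      local name=$1; shift               # e.g. run_test make make -j72
      echo "************************************"
      echo "START TEST $name"
      echo "************************************"
      time "$@"                          # runs the command; emits the real/user/sys lines
      local rc=$?
      echo "************************************"
      echo "END TEST $name"
      echo "************************************"
      return $rc
  }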
00:03:25.466 The Meson build system
00:03:25.466 Version: 1.5.0
00:03:25.466 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
00:03:25.466 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:25.466 Build type: native build
00:03:25.466 Project name: libvfio-user
00:03:25.466 Project version: 0.0.1
00:03:25.466 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)")
00:03:25.466 C linker for the host machine: clang-17 ld.bfd 2.40-14
00:03:25.466 Host machine cpu family: x86_64
00:03:25.466 Host machine cpu: x86_64
00:03:25.466 Run-time dependency threads found: YES
00:03:25.466 Library dl found: YES
00:03:25.466 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5
00:03:25.466 Run-time dependency json-c found: YES 0.17
00:03:25.466 Run-time dependency cmocka found: YES 1.1.7
00:03:25.466 Program pytest-3 found: NO
00:03:25.466 Program flake8 found: NO
00:03:25.466 Program misspell-fixer found: NO
00:03:25.466 Program restructuredtext-lint found: NO
00:03:25.466 Program valgrind found: YES (/usr/bin/valgrind)
00:03:25.466 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:03:25.466 Compiler for C supports arguments -Wmissing-declarations: YES
00:03:25.466 Compiler for C supports arguments -Wwrite-strings: YES
00:03:25.466 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:25.466 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh)
00:03:25.466 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh)
00:03:25.466 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup.
00:03:25.466 Build targets in project: 8
00:03:25.466 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions:
00:03:25.466 * 0.57.0: {'exclude_suites arg in add_test_setup'}
00:03:25.466
00:03:25.466 libvfio-user 0.0.1
00:03:25.466
00:03:25.466 User defined options
00:03:25.466 buildtype : debug
00:03:25.466 default_library: static
00:03:25.466 libdir : /usr/local/lib
00:03:25.466
00:03:25.466 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:03:25.466 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug'
00:03:25.725 [1/36] Compiling C object samples/lspci.p/lspci.c.o
00:03:25.725 [2/36] Compiling C object samples/null.p/null.c.o
00:03:25.725 [3/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o
00:03:25.725 [4/36] Compiling C object samples/client.p/.._lib_tran.c.o
00:03:25.725 [5/36] Compiling C object lib/libvfio-user.a.p/tran.c.o
00:03:25.725 [6/36] Compiling C object lib/libvfio-user.a.p/irq.c.o
00:03:25.725 [7/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o
00:03:25.725 [8/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o
00:03:25.725 [9/36] Compiling C object lib/libvfio-user.a.p/migration.c.o
00:03:25.725 [10/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o
00:03:25.725 [11/36] Compiling C object lib/libvfio-user.a.p/pci.c.o
00:03:25.725 [12/36] Compiling C object samples/client.p/.._lib_migration.c.o
00:03:25.725 [13/36] Compiling C object test/unit_tests.p/mocks.c.o
00:03:25.725 [14/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o
00:03:25.725 [15/36] Compiling C object lib/libvfio-user.a.p/dma.c.o
00:03:25.725 [16/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o
00:03:25.725 [17/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o
00:03:25.725 [18/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o
00:03:25.725 [19/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o
00:03:25.725 [20/36] Compiling C object samples/server.p/server.c.o
00:03:25.725 [21/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o
00:03:25.725 [22/36] Compiling C object test/unit_tests.p/unit-tests.c.o
00:03:25.725 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o
00:03:25.725 [24/36] Compiling C object samples/client.p/client.c.o
00:03:25.725 [25/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o
00:03:25.725 [26/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o
00:03:25.725 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o
00:03:25.725 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o
00:03:25.725 [29/36] Linking static target lib/libvfio-user.a
00:03:25.725 [30/36] Linking target samples/client
00:03:25.725 [31/36] Linking target test/unit_tests
00:03:25.725 [32/36] Linking target samples/shadow_ioeventfd_server
00:03:25.725 [33/36] Linking target samples/null
00:03:25.725 [34/36] Linking target samples/gpio-pci-idio-16
00:03:25.725 [35/36] Linking target samples/lspci
00:03:25.725 [36/36] Linking target samples/server
00:03:25.725 INFO: autodetecting backend as ninja
00:03:25.725 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
00:03:25.984 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:03:26.243 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:03:26.243 ninja: no work to do. 00:03:41.121 CC lib/ut/ut.o 00:03:41.121 CC lib/log/log.o 00:03:41.121 CC lib/log/log_flags.o 00:03:41.121 CC lib/log/log_deprecated.o 00:03:41.121 CC lib/ut_mock/mock.o 00:03:41.121 LIB libspdk_ut.a 00:03:41.121 LIB libspdk_log.a 00:03:41.121 LIB libspdk_ut_mock.a 00:03:41.121 CXX lib/trace_parser/trace.o 00:03:41.121 CC lib/dma/dma.o 00:03:41.121 CC lib/util/cpuset.o 00:03:41.121 CC lib/util/base64.o 00:03:41.121 CC lib/util/bit_array.o 00:03:41.121 CC lib/util/crc32.o 00:03:41.121 CC lib/util/crc16.o 00:03:41.121 CC lib/util/crc32c.o 00:03:41.121 CC lib/util/dif.o 00:03:41.121 CC lib/util/crc32_ieee.o 00:03:41.121 CC lib/util/crc64.o 00:03:41.121 CC lib/util/fd.o 00:03:41.121 CC lib/util/fd_group.o 00:03:41.121 CC lib/ioat/ioat.o 00:03:41.121 CC lib/util/file.o 00:03:41.121 CC lib/util/hexlify.o 00:03:41.121 CC lib/util/iov.o 00:03:41.121 CC lib/util/math.o 00:03:41.121 CC lib/util/strerror_tls.o 00:03:41.121 CC lib/util/net.o 00:03:41.121 CC lib/util/pipe.o 00:03:41.121 CC lib/util/uuid.o 00:03:41.121 CC lib/util/string.o 00:03:41.121 CC lib/util/xor.o 00:03:41.121 CC lib/util/zipf.o 00:03:41.121 CC lib/util/md5.o 00:03:41.121 CC lib/vfio_user/host/vfio_user_pci.o 00:03:41.121 CC lib/vfio_user/host/vfio_user.o 00:03:41.121 LIB libspdk_dma.a 00:03:41.121 LIB libspdk_ioat.a 00:03:41.121 LIB libspdk_vfio_user.a 00:03:41.121 LIB libspdk_util.a 00:03:41.121 LIB libspdk_trace_parser.a 00:03:41.121 CC lib/json/json_parse.o 00:03:41.121 CC lib/json/json_util.o 00:03:41.121 CC lib/json/json_write.o 00:03:41.121 CC lib/conf/conf.o 00:03:41.121 CC lib/idxd/idxd_kernel.o 00:03:41.121 CC lib/idxd/idxd.o 00:03:41.121 CC lib/idxd/idxd_user.o 00:03:41.121 CC lib/vmd/led.o 00:03:41.121 CC lib/vmd/vmd.o 00:03:41.121 CC lib/env_dpdk/env.o 00:03:41.121 CC lib/env_dpdk/memory.o 00:03:41.121 CC lib/env_dpdk/pci.o 00:03:41.121 CC lib/env_dpdk/init.o 00:03:41.121 CC lib/env_dpdk/threads.o 00:03:41.121 CC lib/rdma_utils/rdma_utils.o 00:03:41.121 CC lib/env_dpdk/pci_ioat.o 00:03:41.121 CC lib/env_dpdk/pci_virtio.o 00:03:41.121 CC lib/env_dpdk/pci_vmd.o 00:03:41.121 CC lib/env_dpdk/pci_idxd.o 00:03:41.121 CC lib/env_dpdk/pci_event.o 00:03:41.121 CC lib/env_dpdk/sigbus_handler.o 00:03:41.121 CC lib/env_dpdk/pci_dpdk.o 00:03:41.121 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:41.121 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:41.121 LIB libspdk_conf.a 00:03:41.121 LIB libspdk_json.a 00:03:41.121 LIB libspdk_rdma_utils.a 00:03:41.121 LIB libspdk_idxd.a 00:03:41.121 LIB libspdk_vmd.a 00:03:41.121 CC lib/jsonrpc/jsonrpc_client.o 00:03:41.121 CC lib/jsonrpc/jsonrpc_server.o 00:03:41.121 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:41.121 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:41.121 CC lib/rdma_provider/common.o 00:03:41.121 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:41.121 LIB libspdk_rdma_provider.a 00:03:41.121 LIB libspdk_jsonrpc.a 00:03:41.380 CC lib/rpc/rpc.o 00:03:41.640 LIB libspdk_env_dpdk.a 00:03:41.640 LIB libspdk_rpc.a 00:03:41.899 CC lib/keyring/keyring_rpc.o 00:03:41.899 CC lib/keyring/keyring.o 00:03:41.899 CC lib/notify/notify.o 00:03:41.899 CC lib/notify/notify_rpc.o 00:03:41.899 CC lib/trace/trace_flags.o 00:03:41.899 CC lib/trace/trace.o 00:03:41.899 CC lib/trace/trace_rpc.o 00:03:42.159 LIB libspdk_notify.a 00:03:42.159 LIB libspdk_keyring.a 00:03:42.159 LIB 
libspdk_trace.a 00:03:42.419 CC lib/sock/sock.o 00:03:42.419 CC lib/sock/sock_rpc.o 00:03:42.419 CC lib/thread/iobuf.o 00:03:42.419 CC lib/thread/thread.o 00:03:42.679 LIB libspdk_sock.a 00:03:42.938 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:42.938 CC lib/nvme/nvme_ctrlr.o 00:03:42.938 CC lib/nvme/nvme_fabric.o 00:03:42.938 CC lib/nvme/nvme_pcie_common.o 00:03:42.938 CC lib/nvme/nvme_ns_cmd.o 00:03:42.938 CC lib/nvme/nvme_ns.o 00:03:42.938 CC lib/nvme/nvme.o 00:03:42.938 CC lib/nvme/nvme_pcie.o 00:03:42.938 CC lib/nvme/nvme_qpair.o 00:03:42.938 CC lib/nvme/nvme_quirks.o 00:03:42.938 CC lib/nvme/nvme_transport.o 00:03:42.938 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:42.938 CC lib/nvme/nvme_discovery.o 00:03:42.938 CC lib/nvme/nvme_tcp.o 00:03:42.938 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:42.938 CC lib/nvme/nvme_opal.o 00:03:42.938 CC lib/nvme/nvme_io_msg.o 00:03:42.938 CC lib/nvme/nvme_poll_group.o 00:03:42.938 CC lib/nvme/nvme_zns.o 00:03:42.938 CC lib/nvme/nvme_stubs.o 00:03:42.938 CC lib/nvme/nvme_vfio_user.o 00:03:42.938 CC lib/nvme/nvme_auth.o 00:03:42.938 CC lib/nvme/nvme_cuse.o 00:03:42.938 CC lib/nvme/nvme_rdma.o 00:03:43.196 LIB libspdk_thread.a 00:03:43.454 CC lib/init/json_config.o 00:03:43.454 CC lib/init/subsystem.o 00:03:43.454 CC lib/init/rpc.o 00:03:43.454 CC lib/init/subsystem_rpc.o 00:03:43.454 CC lib/blob/blobstore.o 00:03:43.454 CC lib/accel/accel.o 00:03:43.454 CC lib/accel/accel_rpc.o 00:03:43.454 CC lib/blob/request.o 00:03:43.454 CC lib/blob/blob_bs_dev.o 00:03:43.454 CC lib/accel/accel_sw.o 00:03:43.454 CC lib/blob/zeroes.o 00:03:43.454 CC lib/vfu_tgt/tgt_endpoint.o 00:03:43.454 CC lib/vfu_tgt/tgt_rpc.o 00:03:43.454 CC lib/virtio/virtio_vfio_user.o 00:03:43.454 CC lib/virtio/virtio.o 00:03:43.454 CC lib/virtio/virtio_vhost_user.o 00:03:43.454 CC lib/virtio/virtio_pci.o 00:03:43.454 CC lib/fsdev/fsdev.o 00:03:43.454 CC lib/fsdev/fsdev_io.o 00:03:43.454 CC lib/fsdev/fsdev_rpc.o 00:03:43.713 LIB libspdk_init.a 00:03:43.713 LIB libspdk_virtio.a 00:03:43.713 LIB libspdk_vfu_tgt.a 00:03:43.972 LIB libspdk_fsdev.a 00:03:43.972 CC lib/event/reactor.o 00:03:43.972 CC lib/event/log_rpc.o 00:03:43.972 CC lib/event/app.o 00:03:43.972 CC lib/event/app_rpc.o 00:03:43.972 CC lib/event/scheduler_static.o 00:03:44.230 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:44.230 LIB libspdk_event.a 00:03:44.230 LIB libspdk_accel.a 00:03:44.230 LIB libspdk_nvme.a 00:03:44.488 LIB libspdk_fuse_dispatcher.a 00:03:44.488 CC lib/bdev/bdev.o 00:03:44.488 CC lib/bdev/bdev_rpc.o 00:03:44.488 CC lib/bdev/scsi_nvme.o 00:03:44.488 CC lib/bdev/bdev_zone.o 00:03:44.488 CC lib/bdev/part.o 00:03:45.424 LIB libspdk_blob.a 00:03:45.682 CC lib/blobfs/blobfs.o 00:03:45.682 CC lib/blobfs/tree.o 00:03:45.682 CC lib/lvol/lvol.o 00:03:46.249 LIB libspdk_lvol.a 00:03:46.249 LIB libspdk_blobfs.a 00:03:46.507 LIB libspdk_bdev.a 00:03:46.772 CC lib/scsi/dev.o 00:03:46.772 CC lib/scsi/lun.o 00:03:46.773 CC lib/scsi/port.o 00:03:46.773 CC lib/scsi/scsi.o 00:03:46.773 CC lib/scsi/scsi_bdev.o 00:03:46.773 CC lib/scsi/scsi_rpc.o 00:03:46.773 CC lib/scsi/scsi_pr.o 00:03:46.773 CC lib/scsi/task.o 00:03:46.773 CC lib/nvmf/ctrlr.o 00:03:46.773 CC lib/nvmf/ctrlr_discovery.o 00:03:46.773 CC lib/nvmf/ctrlr_bdev.o 00:03:46.773 CC lib/nvmf/subsystem.o 00:03:46.773 CC lib/nvmf/nvmf.o 00:03:46.773 CC lib/nvmf/transport.o 00:03:46.773 CC lib/nvmf/tcp.o 00:03:46.773 CC lib/nvmf/nvmf_rpc.o 00:03:46.773 CC lib/ftl/ftl_core.o 00:03:46.773 CC lib/ublk/ublk.o 00:03:46.773 CC lib/ublk/ublk_rpc.o 00:03:46.773 CC lib/ftl/ftl_init.o 00:03:46.773 CC 
lib/ftl/ftl_layout.o 00:03:46.773 CC lib/nvmf/stubs.o 00:03:46.773 CC lib/nvmf/mdns_server.o 00:03:46.773 CC lib/ftl/ftl_debug.o 00:03:46.773 CC lib/nvmf/vfio_user.o 00:03:46.773 CC lib/ftl/ftl_sb.o 00:03:46.773 CC lib/ftl/ftl_io.o 00:03:46.773 CC lib/nvmf/rdma.o 00:03:46.773 CC lib/nvmf/auth.o 00:03:46.773 CC lib/ftl/ftl_l2p_flat.o 00:03:46.773 CC lib/ftl/ftl_l2p.o 00:03:46.773 CC lib/nbd/nbd.o 00:03:46.773 CC lib/nbd/nbd_rpc.o 00:03:46.773 CC lib/ftl/ftl_nv_cache.o 00:03:46.773 CC lib/ftl/ftl_band.o 00:03:46.773 CC lib/ftl/ftl_band_ops.o 00:03:46.773 CC lib/ftl/ftl_writer.o 00:03:46.773 CC lib/ftl/ftl_rq.o 00:03:46.773 CC lib/ftl/ftl_l2p_cache.o 00:03:46.773 CC lib/ftl/ftl_reloc.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt.o 00:03:46.773 CC lib/ftl/ftl_p2l.o 00:03:46.773 CC lib/ftl/ftl_p2l_log.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:46.773 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:46.773 CC lib/ftl/utils/ftl_conf.o 00:03:46.773 CC lib/ftl/utils/ftl_mempool.o 00:03:46.773 CC lib/ftl/utils/ftl_md.o 00:03:46.773 CC lib/ftl/utils/ftl_property.o 00:03:46.773 CC lib/ftl/utils/ftl_bitmap.o 00:03:46.773 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:46.773 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:46.773 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:46.773 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:46.773 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:46.773 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:46.773 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:46.773 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:46.773 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:46.773 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:46.773 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:46.773 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:47.032 CC lib/ftl/base/ftl_base_dev.o 00:03:47.032 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:47.032 CC lib/ftl/base/ftl_base_bdev.o 00:03:47.032 CC lib/ftl/ftl_trace.o 00:03:47.292 LIB libspdk_scsi.a 00:03:47.292 LIB libspdk_nbd.a 00:03:47.292 LIB libspdk_ublk.a 00:03:47.551 CC lib/vhost/vhost_scsi.o 00:03:47.551 CC lib/vhost/vhost.o 00:03:47.551 CC lib/vhost/vhost_rpc.o 00:03:47.551 CC lib/vhost/vhost_blk.o 00:03:47.551 CC lib/vhost/rte_vhost_user.o 00:03:47.551 CC lib/iscsi/conn.o 00:03:47.551 CC lib/iscsi/init_grp.o 00:03:47.551 CC lib/iscsi/portal_grp.o 00:03:47.551 CC lib/iscsi/iscsi.o 00:03:47.551 CC lib/iscsi/param.o 00:03:47.551 CC lib/iscsi/tgt_node.o 00:03:47.551 CC lib/iscsi/task.o 00:03:47.551 CC lib/iscsi/iscsi_subsystem.o 00:03:47.551 CC lib/iscsi/iscsi_rpc.o 00:03:47.551 LIB libspdk_ftl.a 00:03:48.121 LIB libspdk_nvmf.a 00:03:48.121 LIB libspdk_vhost.a 00:03:48.380 LIB libspdk_iscsi.a 00:03:48.638 CC module/vfu_device/vfu_virtio_blk.o 00:03:48.638 CC module/vfu_device/vfu_virtio_scsi.o 00:03:48.638 CC module/vfu_device/vfu_virtio.o 00:03:48.638 CC module/vfu_device/vfu_virtio_rpc.o 00:03:48.638 CC module/vfu_device/vfu_virtio_fs.o 00:03:48.638 CC module/env_dpdk/env_dpdk_rpc.o 00:03:48.897 CC module/fsdev/aio/fsdev_aio.o 00:03:48.897 CC module/fsdev/aio/linux_aio_mgr.o 00:03:48.897 CC module/fsdev/aio/fsdev_aio_rpc.o 00:03:48.897 CC 
module/scheduler/gscheduler/gscheduler.o 00:03:48.897 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:48.897 CC module/keyring/linux/keyring.o 00:03:48.897 CC module/blob/bdev/blob_bdev.o 00:03:48.897 CC module/keyring/linux/keyring_rpc.o 00:03:48.897 CC module/keyring/file/keyring.o 00:03:48.897 LIB libspdk_env_dpdk_rpc.a 00:03:48.897 CC module/keyring/file/keyring_rpc.o 00:03:48.897 CC module/accel/dsa/accel_dsa.o 00:03:48.897 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:48.897 CC module/accel/dsa/accel_dsa_rpc.o 00:03:48.897 CC module/sock/posix/posix.o 00:03:48.897 CC module/accel/error/accel_error.o 00:03:48.897 CC module/accel/ioat/accel_ioat.o 00:03:48.897 CC module/accel/error/accel_error_rpc.o 00:03:48.897 CC module/accel/ioat/accel_ioat_rpc.o 00:03:48.897 CC module/accel/iaa/accel_iaa.o 00:03:48.897 CC module/accel/iaa/accel_iaa_rpc.o 00:03:48.897 LIB libspdk_scheduler_gscheduler.a 00:03:48.897 LIB libspdk_keyring_linux.a 00:03:48.897 LIB libspdk_keyring_file.a 00:03:48.897 LIB libspdk_scheduler_dynamic.a 00:03:48.897 LIB libspdk_scheduler_dpdk_governor.a 00:03:48.897 LIB libspdk_accel_error.a 00:03:49.155 LIB libspdk_blob_bdev.a 00:03:49.155 LIB libspdk_accel_ioat.a 00:03:49.155 LIB libspdk_accel_iaa.a 00:03:49.155 LIB libspdk_accel_dsa.a 00:03:49.155 LIB libspdk_vfu_device.a 00:03:49.414 LIB libspdk_fsdev_aio.a 00:03:49.414 LIB libspdk_sock_posix.a 00:03:49.414 CC module/bdev/aio/bdev_aio_rpc.o 00:03:49.414 CC module/bdev/aio/bdev_aio.o 00:03:49.414 CC module/bdev/passthru/vbdev_passthru.o 00:03:49.414 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:49.414 CC module/bdev/delay/vbdev_delay.o 00:03:49.414 CC module/bdev/gpt/gpt.o 00:03:49.414 CC module/bdev/error/vbdev_error_rpc.o 00:03:49.414 CC module/bdev/error/vbdev_error.o 00:03:49.414 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:49.414 CC module/bdev/gpt/vbdev_gpt.o 00:03:49.414 CC module/bdev/lvol/vbdev_lvol.o 00:03:49.414 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:49.414 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:49.414 CC module/bdev/nvme/bdev_mdns_client.o 00:03:49.414 CC module/bdev/nvme/nvme_rpc.o 00:03:49.414 CC module/bdev/nvme/bdev_nvme.o 00:03:49.414 CC module/bdev/nvme/vbdev_opal.o 00:03:49.414 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:49.414 CC module/bdev/null/bdev_null.o 00:03:49.414 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:49.414 CC module/bdev/ftl/bdev_ftl.o 00:03:49.414 CC module/bdev/raid/bdev_raid_rpc.o 00:03:49.414 CC module/bdev/null/bdev_null_rpc.o 00:03:49.414 CC module/bdev/raid/bdev_raid.o 00:03:49.414 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:49.414 CC module/bdev/raid/bdev_raid_sb.o 00:03:49.414 CC module/bdev/raid/raid0.o 00:03:49.414 CC module/bdev/raid/raid1.o 00:03:49.414 CC module/bdev/raid/concat.o 00:03:49.414 CC module/blobfs/bdev/blobfs_bdev.o 00:03:49.414 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:03:49.414 CC module/bdev/malloc/bdev_malloc.o 00:03:49.414 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:49.414 CC module/bdev/split/vbdev_split.o 00:03:49.414 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:49.414 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:49.414 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:49.414 CC module/bdev/split/vbdev_split_rpc.o 00:03:49.414 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:49.414 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:49.414 CC module/bdev/iscsi/bdev_iscsi.o 00:03:49.414 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:49.673 LIB libspdk_bdev_error.a 00:03:49.673 LIB libspdk_bdev_gpt.a 00:03:49.673 LIB 
libspdk_bdev_null.a 00:03:49.673 LIB libspdk_bdev_passthru.a 00:03:49.673 LIB libspdk_bdev_ftl.a 00:03:49.673 LIB libspdk_blobfs_bdev.a 00:03:49.673 LIB libspdk_bdev_split.a 00:03:49.673 LIB libspdk_bdev_aio.a 00:03:49.673 LIB libspdk_bdev_delay.a 00:03:49.673 LIB libspdk_bdev_malloc.a 00:03:49.673 LIB libspdk_bdev_iscsi.a 00:03:49.673 LIB libspdk_bdev_zone_block.a 00:03:49.932 LIB libspdk_bdev_lvol.a 00:03:49.932 LIB libspdk_bdev_virtio.a 00:03:50.191 LIB libspdk_bdev_raid.a 00:03:51.129 LIB libspdk_bdev_nvme.a 00:03:51.696 CC module/event/subsystems/scheduler/scheduler.o 00:03:51.696 CC module/event/subsystems/keyring/keyring.o 00:03:51.696 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:51.696 CC module/event/subsystems/vmd/vmd.o 00:03:51.696 CC module/event/subsystems/sock/sock.o 00:03:51.696 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:03:51.696 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:51.696 CC module/event/subsystems/iobuf/iobuf.o 00:03:51.696 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:51.696 CC module/event/subsystems/fsdev/fsdev.o 00:03:51.696 LIB libspdk_event_keyring.a 00:03:51.696 LIB libspdk_event_vfu_tgt.a 00:03:51.696 LIB libspdk_event_vmd.a 00:03:51.696 LIB libspdk_event_scheduler.a 00:03:51.696 LIB libspdk_event_vhost_blk.a 00:03:51.696 LIB libspdk_event_sock.a 00:03:51.696 LIB libspdk_event_iobuf.a 00:03:51.696 LIB libspdk_event_fsdev.a 00:03:51.954 CC module/event/subsystems/accel/accel.o 00:03:52.213 LIB libspdk_event_accel.a 00:03:52.472 CC module/event/subsystems/bdev/bdev.o 00:03:52.472 LIB libspdk_event_bdev.a 00:03:52.731 CC module/event/subsystems/scsi/scsi.o 00:03:52.731 CC module/event/subsystems/ublk/ublk.o 00:03:52.731 CC module/event/subsystems/nbd/nbd.o 00:03:52.731 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:52.731 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:52.989 LIB libspdk_event_ublk.a 00:03:52.989 LIB libspdk_event_nbd.a 00:03:52.989 LIB libspdk_event_scsi.a 00:03:52.989 LIB libspdk_event_nvmf.a 00:03:53.299 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:53.299 CC module/event/subsystems/iscsi/iscsi.o 00:03:53.299 LIB libspdk_event_vhost_scsi.a 00:03:53.300 LIB libspdk_event_iscsi.a 00:03:53.558 CC app/spdk_lspci/spdk_lspci.o 00:03:53.558 CC app/spdk_nvme_identify/identify.o 00:03:53.558 TEST_HEADER include/spdk/accel.h 00:03:53.558 CC app/spdk_top/spdk_top.o 00:03:53.558 TEST_HEADER include/spdk/base64.h 00:03:53.558 TEST_HEADER include/spdk/accel_module.h 00:03:53.558 TEST_HEADER include/spdk/barrier.h 00:03:53.558 TEST_HEADER include/spdk/assert.h 00:03:53.558 TEST_HEADER include/spdk/bdev_module.h 00:03:53.558 TEST_HEADER include/spdk/bdev_zone.h 00:03:53.558 CC app/trace_record/trace_record.o 00:03:53.558 TEST_HEADER include/spdk/bit_array.h 00:03:53.558 TEST_HEADER include/spdk/bdev.h 00:03:53.558 TEST_HEADER include/spdk/bit_pool.h 00:03:53.558 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:53.558 TEST_HEADER include/spdk/blob_bdev.h 00:03:53.558 TEST_HEADER include/spdk/blob.h 00:03:53.558 TEST_HEADER include/spdk/conf.h 00:03:53.558 TEST_HEADER include/spdk/config.h 00:03:53.558 TEST_HEADER include/spdk/blobfs.h 00:03:53.558 TEST_HEADER include/spdk/cpuset.h 00:03:53.558 TEST_HEADER include/spdk/crc16.h 00:03:53.558 CXX app/trace/trace.o 00:03:53.558 TEST_HEADER include/spdk/crc32.h 00:03:53.558 TEST_HEADER include/spdk/dif.h 00:03:53.559 TEST_HEADER include/spdk/crc64.h 00:03:53.559 TEST_HEADER include/spdk/endian.h 00:03:53.559 TEST_HEADER include/spdk/dma.h 00:03:53.559 TEST_HEADER 
include/spdk/env_dpdk.h 00:03:53.559 TEST_HEADER include/spdk/env.h 00:03:53.559 CC app/spdk_nvme_perf/perf.o 00:03:53.559 TEST_HEADER include/spdk/fd_group.h 00:03:53.559 TEST_HEADER include/spdk/fd.h 00:03:53.559 TEST_HEADER include/spdk/fsdev.h 00:03:53.559 TEST_HEADER include/spdk/file.h 00:03:53.559 TEST_HEADER include/spdk/fsdev_module.h 00:03:53.559 TEST_HEADER include/spdk/event.h 00:03:53.559 TEST_HEADER include/spdk/ftl.h 00:03:53.559 CC app/spdk_nvme_discover/discovery_aer.o 00:03:53.559 TEST_HEADER include/spdk/fuse_dispatcher.h 00:03:53.559 TEST_HEADER include/spdk/gpt_spec.h 00:03:53.559 TEST_HEADER include/spdk/histogram_data.h 00:03:53.559 TEST_HEADER include/spdk/hexlify.h 00:03:53.559 TEST_HEADER include/spdk/idxd.h 00:03:53.559 TEST_HEADER include/spdk/ioat.h 00:03:53.559 TEST_HEADER include/spdk/idxd_spec.h 00:03:53.824 TEST_HEADER include/spdk/init.h 00:03:53.824 TEST_HEADER include/spdk/ioat_spec.h 00:03:53.824 TEST_HEADER include/spdk/iscsi_spec.h 00:03:53.824 TEST_HEADER include/spdk/json.h 00:03:53.824 TEST_HEADER include/spdk/jsonrpc.h 00:03:53.824 TEST_HEADER include/spdk/keyring.h 00:03:53.824 TEST_HEADER include/spdk/keyring_module.h 00:03:53.824 TEST_HEADER include/spdk/likely.h 00:03:53.824 TEST_HEADER include/spdk/log.h 00:03:53.824 CC test/rpc_client/rpc_client_test.o 00:03:53.824 TEST_HEADER include/spdk/lvol.h 00:03:53.824 TEST_HEADER include/spdk/md5.h 00:03:53.824 TEST_HEADER include/spdk/mmio.h 00:03:53.824 TEST_HEADER include/spdk/memory.h 00:03:53.824 TEST_HEADER include/spdk/nbd.h 00:03:53.824 TEST_HEADER include/spdk/net.h 00:03:53.824 TEST_HEADER include/spdk/notify.h 00:03:53.824 TEST_HEADER include/spdk/nvme.h 00:03:53.824 TEST_HEADER include/spdk/nvme_intel.h 00:03:53.824 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:53.824 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:53.824 TEST_HEADER include/spdk/nvme_spec.h 00:03:53.824 TEST_HEADER include/spdk/nvme_zns.h 00:03:53.824 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:53.824 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:53.824 TEST_HEADER include/spdk/nvmf.h 00:03:53.824 TEST_HEADER include/spdk/nvmf_spec.h 00:03:53.824 TEST_HEADER include/spdk/nvmf_transport.h 00:03:53.824 TEST_HEADER include/spdk/opal.h 00:03:53.824 TEST_HEADER include/spdk/opal_spec.h 00:03:53.824 TEST_HEADER include/spdk/pci_ids.h 00:03:53.824 TEST_HEADER include/spdk/pipe.h 00:03:53.824 TEST_HEADER include/spdk/queue.h 00:03:53.824 TEST_HEADER include/spdk/reduce.h 00:03:53.824 TEST_HEADER include/spdk/rpc.h 00:03:53.824 TEST_HEADER include/spdk/scheduler.h 00:03:53.824 TEST_HEADER include/spdk/scsi.h 00:03:53.824 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:53.824 TEST_HEADER include/spdk/scsi_spec.h 00:03:53.824 TEST_HEADER include/spdk/sock.h 00:03:53.824 CC app/nvmf_tgt/nvmf_main.o 00:03:53.824 TEST_HEADER include/spdk/stdinc.h 00:03:53.824 TEST_HEADER include/spdk/thread.h 00:03:53.824 TEST_HEADER include/spdk/string.h 00:03:53.824 TEST_HEADER include/spdk/trace.h 00:03:53.824 TEST_HEADER include/spdk/trace_parser.h 00:03:53.824 TEST_HEADER include/spdk/tree.h 00:03:53.824 TEST_HEADER include/spdk/ublk.h 00:03:53.824 TEST_HEADER include/spdk/util.h 00:03:53.824 TEST_HEADER include/spdk/uuid.h 00:03:53.824 TEST_HEADER include/spdk/version.h 00:03:53.824 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:53.824 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:53.824 TEST_HEADER include/spdk/vhost.h 00:03:53.824 TEST_HEADER include/spdk/xor.h 00:03:53.824 TEST_HEADER include/spdk/vmd.h 00:03:53.824 TEST_HEADER 
include/spdk/zipf.h 00:03:53.824 CXX test/cpp_headers/accel.o 00:03:53.824 CXX test/cpp_headers/accel_module.o 00:03:53.824 CC app/spdk_dd/spdk_dd.o 00:03:53.824 CXX test/cpp_headers/assert.o 00:03:53.824 CXX test/cpp_headers/barrier.o 00:03:53.824 CXX test/cpp_headers/base64.o 00:03:53.824 CXX test/cpp_headers/bdev.o 00:03:53.824 CXX test/cpp_headers/bdev_module.o 00:03:53.824 CXX test/cpp_headers/bdev_zone.o 00:03:53.824 CXX test/cpp_headers/bit_array.o 00:03:53.824 CXX test/cpp_headers/blob_bdev.o 00:03:53.824 CXX test/cpp_headers/blobfs_bdev.o 00:03:53.824 CXX test/cpp_headers/bit_pool.o 00:03:53.824 CXX test/cpp_headers/blobfs.o 00:03:53.824 CXX test/cpp_headers/blob.o 00:03:53.824 CXX test/cpp_headers/conf.o 00:03:53.824 CXX test/cpp_headers/cpuset.o 00:03:53.824 CXX test/cpp_headers/config.o 00:03:53.824 CXX test/cpp_headers/crc16.o 00:03:53.824 CXX test/cpp_headers/crc32.o 00:03:53.824 CXX test/cpp_headers/crc64.o 00:03:53.824 CXX test/cpp_headers/dif.o 00:03:53.824 CXX test/cpp_headers/dma.o 00:03:53.824 CXX test/cpp_headers/endian.o 00:03:53.824 CC test/app/jsoncat/jsoncat.o 00:03:53.824 CXX test/cpp_headers/env_dpdk.o 00:03:53.824 CC app/iscsi_tgt/iscsi_tgt.o 00:03:53.824 CXX test/cpp_headers/env.o 00:03:53.824 CXX test/cpp_headers/event.o 00:03:53.824 CXX test/cpp_headers/fd_group.o 00:03:53.824 CXX test/cpp_headers/fd.o 00:03:53.824 CXX test/cpp_headers/file.o 00:03:53.824 CXX test/cpp_headers/fsdev.o 00:03:53.824 CXX test/cpp_headers/fsdev_module.o 00:03:53.824 CXX test/cpp_headers/ftl.o 00:03:53.824 CXX test/cpp_headers/gpt_spec.o 00:03:53.824 CXX test/cpp_headers/fuse_dispatcher.o 00:03:53.824 CXX test/cpp_headers/hexlify.o 00:03:53.824 CXX test/cpp_headers/histogram_data.o 00:03:53.824 CXX test/cpp_headers/idxd.o 00:03:53.824 CC test/app/stub/stub.o 00:03:53.824 CC test/app/histogram_perf/histogram_perf.o 00:03:53.824 CC test/thread/lock/spdk_lock.o 00:03:53.824 CC examples/ioat/perf/perf.o 00:03:53.824 CC examples/ioat/verify/verify.o 00:03:53.824 CC examples/util/zipf/zipf.o 00:03:53.824 CC test/thread/poller_perf/poller_perf.o 00:03:53.824 CC test/env/pci/pci_ut.o 00:03:53.824 CC test/env/memory/memory_ut.o 00:03:53.824 CXX test/cpp_headers/idxd_spec.o 00:03:53.824 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:53.824 CC app/fio/nvme/fio_plugin.o 00:03:53.824 CC test/env/vtophys/vtophys.o 00:03:53.824 CC app/spdk_tgt/spdk_tgt.o 00:03:53.824 LINK spdk_lspci 00:03:53.824 CC test/app/bdev_svc/bdev_svc.o 00:03:53.824 CC test/dma/test_dma/test_dma.o 00:03:53.824 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:53.824 CC app/fio/bdev/fio_plugin.o 00:03:53.824 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:53.824 CC test/env/mem_callbacks/mem_callbacks.o 00:03:53.824 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:53.824 LINK spdk_nvme_discover 00:03:53.824 LINK rpc_client_test 00:03:53.824 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:03:53.824 LINK jsoncat 00:03:53.824 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:53.824 CXX test/cpp_headers/init.o 00:03:53.824 CXX test/cpp_headers/ioat.o 00:03:53.824 CXX test/cpp_headers/ioat_spec.o 00:03:53.824 CXX test/cpp_headers/iscsi_spec.o 00:03:53.824 CXX test/cpp_headers/json.o 00:03:53.824 CXX test/cpp_headers/jsonrpc.o 00:03:53.824 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:03:53.824 LINK histogram_perf 00:03:53.824 CXX test/cpp_headers/keyring_module.o 00:03:53.824 CXX test/cpp_headers/keyring.o 00:03:54.083 LINK spdk_trace_record 00:03:54.083 CXX test/cpp_headers/likely.o 00:03:54.083 CXX 
test/cpp_headers/log.o 00:03:54.083 CXX test/cpp_headers/lvol.o 00:03:54.083 CXX test/cpp_headers/md5.o 00:03:54.083 CXX test/cpp_headers/memory.o 00:03:54.083 CXX test/cpp_headers/mmio.o 00:03:54.083 CXX test/cpp_headers/nbd.o 00:03:54.083 CXX test/cpp_headers/net.o 00:03:54.083 CXX test/cpp_headers/notify.o 00:03:54.083 CXX test/cpp_headers/nvme.o 00:03:54.083 CXX test/cpp_headers/nvme_intel.o 00:03:54.083 CXX test/cpp_headers/nvme_ocssd.o 00:03:54.083 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:54.083 CXX test/cpp_headers/nvme_spec.o 00:03:54.083 CXX test/cpp_headers/nvme_zns.o 00:03:54.083 CXX test/cpp_headers/nvmf_cmd.o 00:03:54.083 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:54.083 CXX test/cpp_headers/nvmf.o 00:03:54.083 LINK poller_perf 00:03:54.083 CXX test/cpp_headers/nvmf_spec.o 00:03:54.083 CXX test/cpp_headers/nvmf_transport.o 00:03:54.083 LINK zipf 00:03:54.083 LINK nvmf_tgt 00:03:54.083 LINK interrupt_tgt 00:03:54.083 CXX test/cpp_headers/opal.o 00:03:54.083 CXX test/cpp_headers/opal_spec.o 00:03:54.083 CXX test/cpp_headers/pci_ids.o 00:03:54.083 LINK stub 00:03:54.083 CXX test/cpp_headers/pipe.o 00:03:54.083 CXX test/cpp_headers/queue.o 00:03:54.083 LINK vtophys 00:03:54.083 CXX test/cpp_headers/reduce.o 00:03:54.083 LINK env_dpdk_post_init 00:03:54.083 CXX test/cpp_headers/rpc.o 00:03:54.083 CXX test/cpp_headers/scheduler.o 00:03:54.083 CXX test/cpp_headers/scsi.o 00:03:54.083 LINK iscsi_tgt 00:03:54.083 LINK verify 00:03:54.083 CXX test/cpp_headers/scsi_spec.o 00:03:54.083 LINK ioat_perf 00:03:54.083 CXX test/cpp_headers/sock.o 00:03:54.083 LINK bdev_svc 00:03:54.083 LINK spdk_tgt 00:03:54.083 CXX test/cpp_headers/stdinc.o 00:03:54.083 CXX test/cpp_headers/string.o 00:03:54.083 CXX test/cpp_headers/thread.o 00:03:54.083 CXX test/cpp_headers/trace.o 00:03:54.083 CXX test/cpp_headers/trace_parser.o 00:03:54.083 CXX test/cpp_headers/tree.o 00:03:54.083 CXX test/cpp_headers/ublk.o 00:03:54.083 CXX test/cpp_headers/util.o 00:03:54.083 CXX test/cpp_headers/uuid.o 00:03:54.083 CXX test/cpp_headers/version.o 00:03:54.083 CXX test/cpp_headers/vfio_user_pci.o 00:03:54.083 CXX test/cpp_headers/vfio_user_spec.o 00:03:54.083 LINK spdk_trace 00:03:54.083 CXX test/cpp_headers/vhost.o 00:03:54.083 CXX test/cpp_headers/vmd.o 00:03:54.083 CXX test/cpp_headers/xor.o 00:03:54.083 CXX test/cpp_headers/zipf.o 00:03:54.342 LINK llvm_vfio_fuzz 00:03:54.342 LINK nvme_fuzz 00:03:54.342 LINK spdk_dd 00:03:54.342 LINK pci_ut 00:03:54.342 LINK test_dma 00:03:54.342 LINK spdk_nvme 00:03:54.342 LINK vhost_fuzz 00:03:54.600 LINK spdk_nvme_perf 00:03:54.600 LINK spdk_bdev 00:03:54.600 LINK spdk_nvme_identify 00:03:54.600 LINK mem_callbacks 00:03:54.600 LINK spdk_top 00:03:54.600 LINK llvm_nvme_fuzz 00:03:54.600 CC app/vhost/vhost.o 00:03:54.600 CC examples/vmd/led/led.o 00:03:54.600 CC examples/vmd/lsvmd/lsvmd.o 00:03:54.600 CC examples/idxd/perf/perf.o 00:03:54.858 CC examples/sock/hello_world/hello_sock.o 00:03:54.858 CC examples/thread/thread/thread_ex.o 00:03:54.858 LINK lsvmd 00:03:54.858 LINK led 00:03:54.858 LINK vhost 00:03:54.858 LINK memory_ut 00:03:54.858 LINK hello_sock 00:03:54.858 LINK idxd_perf 00:03:55.116 LINK thread 00:03:55.116 LINK spdk_lock 00:03:55.116 LINK iscsi_fuzz 00:03:55.683 CC examples/nvme/arbitration/arbitration.o 00:03:55.683 CC examples/nvme/abort/abort.o 00:03:55.683 CC examples/nvme/hotplug/hotplug.o 00:03:55.683 CC examples/nvme/reconnect/reconnect.o 00:03:55.683 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:55.683 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:55.683 CC 
examples/nvme/pmr_persistence/pmr_persistence.o 00:03:55.683 CC examples/nvme/hello_world/hello_world.o 00:03:55.683 CC test/event/event_perf/event_perf.o 00:03:55.683 CC test/event/reactor/reactor.o 00:03:55.683 CC test/event/reactor_perf/reactor_perf.o 00:03:55.683 CC test/event/app_repeat/app_repeat.o 00:03:55.683 CC test/event/scheduler/scheduler.o 00:03:55.941 LINK pmr_persistence 00:03:55.941 LINK cmb_copy 00:03:55.941 LINK hotplug 00:03:55.941 LINK hello_world 00:03:55.941 LINK reactor 00:03:55.941 LINK reactor_perf 00:03:55.941 LINK event_perf 00:03:55.941 LINK arbitration 00:03:55.941 LINK reconnect 00:03:55.941 LINK abort 00:03:55.941 LINK app_repeat 00:03:55.941 LINK nvme_manage 00:03:55.941 LINK scheduler 00:03:55.941 CC test/nvme/aer/aer.o 00:03:55.941 CC test/nvme/startup/startup.o 00:03:55.941 CC test/nvme/e2edp/nvme_dp.o 00:03:55.941 CC test/nvme/reset/reset.o 00:03:55.941 CC test/nvme/cuse/cuse.o 00:03:55.941 CC test/nvme/compliance/nvme_compliance.o 00:03:55.941 CC test/nvme/overhead/overhead.o 00:03:55.941 CC test/nvme/simple_copy/simple_copy.o 00:03:55.941 CC test/nvme/fdp/fdp.o 00:03:55.941 CC test/nvme/reserve/reserve.o 00:03:55.941 CC test/nvme/connect_stress/connect_stress.o 00:03:55.941 CC test/nvme/err_injection/err_injection.o 00:03:55.941 CC test/nvme/sgl/sgl.o 00:03:55.941 CC test/nvme/boot_partition/boot_partition.o 00:03:55.941 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:55.941 CC test/nvme/fused_ordering/fused_ordering.o 00:03:55.941 CC test/blobfs/mkfs/mkfs.o 00:03:56.200 CC test/accel/dif/dif.o 00:03:56.200 CC test/lvol/esnap/esnap.o 00:03:56.200 LINK connect_stress 00:03:56.200 LINK startup 00:03:56.200 LINK boot_partition 00:03:56.200 LINK reserve 00:03:56.200 LINK err_injection 00:03:56.200 LINK doorbell_aers 00:03:56.200 LINK fused_ordering 00:03:56.200 LINK simple_copy 00:03:56.200 LINK aer 00:03:56.200 LINK nvme_dp 00:03:56.200 LINK reset 00:03:56.200 LINK sgl 00:03:56.200 LINK fdp 00:03:56.200 LINK mkfs 00:03:56.458 LINK nvme_compliance 00:03:56.458 LINK overhead 00:03:56.458 LINK dif 00:03:57.027 CC examples/accel/perf/accel_perf.o 00:03:57.027 CC examples/blob/hello_world/hello_blob.o 00:03:57.027 CC examples/fsdev/hello_world/hello_fsdev.o 00:03:57.027 CC examples/blob/cli/blobcli.o 00:03:57.027 LINK hello_blob 00:03:57.027 LINK cuse 00:03:57.027 LINK hello_fsdev 00:03:57.027 LINK accel_perf 00:03:57.286 LINK blobcli 00:03:57.854 CC examples/bdev/bdevperf/bdevperf.o 00:03:57.854 CC examples/bdev/hello_world/hello_bdev.o 00:03:58.113 LINK hello_bdev 00:03:58.113 CC test/bdev/bdevio/bdevio.o 00:03:58.373 LINK bdevio 00:03:58.373 LINK bdevperf 00:03:59.754 LINK esnap 00:04:00.013 CC examples/nvmf/nvmf/nvmf.o 00:04:00.276 LINK nvmf 00:04:01.482 00:04:01.482 real 0m38.366s 00:04:01.482 user 5m11.324s 00:04:01.482 sys 1m36.011s 00:04:01.482 12:34:31 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:04:01.482 12:34:31 make -- common/autotest_common.sh@10 -- $ set +x 00:04:01.482 ************************************ 00:04:01.482 END TEST make 00:04:01.482 ************************************ 00:04:01.482 12:34:31 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:01.482 12:34:31 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:01.482 12:34:31 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:01.482 12:34:31 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:01.482 12:34:31 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 
00:04:01.482 12:34:31 -- pm/common@44 -- $ pid=478650 00:04:01.482 12:34:31 -- pm/common@50 -- $ kill -TERM 478650 00:04:01.482 12:34:31 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:01.482 12:34:31 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:04:01.482 12:34:31 -- pm/common@44 -- $ pid=478651 00:04:01.482 12:34:31 -- pm/common@50 -- $ kill -TERM 478651 00:04:01.482 12:34:31 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:01.482 12:34:31 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:04:01.482 12:34:31 -- pm/common@44 -- $ pid=478653 00:04:01.482 12:34:31 -- pm/common@50 -- $ kill -TERM 478653 00:04:01.482 12:34:31 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:01.482 12:34:31 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:04:01.482 12:34:31 -- pm/common@44 -- $ pid=478677 00:04:01.482 12:34:31 -- pm/common@50 -- $ sudo -E kill -TERM 478677 00:04:01.482 12:34:31 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:04:01.482 12:34:31 -- spdk/autorun.sh@27 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:04:01.786 12:34:31 -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:01.786 12:34:31 -- common/autotest_common.sh@1693 -- # lcov --version 00:04:01.786 12:34:31 -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:01.786 12:34:31 -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:01.786 12:34:31 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:01.786 12:34:31 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:01.786 12:34:31 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:01.786 12:34:31 -- scripts/common.sh@336 -- # IFS=.-: 00:04:01.786 12:34:31 -- scripts/common.sh@336 -- # read -ra ver1 00:04:01.786 12:34:31 -- scripts/common.sh@337 -- # IFS=.-: 00:04:01.786 12:34:31 -- scripts/common.sh@337 -- # read -ra ver2 00:04:01.786 12:34:31 -- scripts/common.sh@338 -- # local 'op=<' 00:04:01.786 12:34:31 -- scripts/common.sh@340 -- # ver1_l=2 00:04:01.786 12:34:31 -- scripts/common.sh@341 -- # ver2_l=1 00:04:01.786 12:34:31 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:01.786 12:34:31 -- scripts/common.sh@344 -- # case "$op" in 00:04:01.786 12:34:31 -- scripts/common.sh@345 -- # : 1 00:04:01.786 12:34:31 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:01.786 12:34:31 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:01.786 12:34:31 -- scripts/common.sh@365 -- # decimal 1 00:04:01.786 12:34:31 -- scripts/common.sh@353 -- # local d=1 00:04:01.786 12:34:31 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:01.786 12:34:31 -- scripts/common.sh@355 -- # echo 1 00:04:01.786 12:34:31 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:01.786 12:34:31 -- scripts/common.sh@366 -- # decimal 2 00:04:01.786 12:34:31 -- scripts/common.sh@353 -- # local d=2 00:04:01.786 12:34:31 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:01.786 12:34:31 -- scripts/common.sh@355 -- # echo 2 00:04:01.786 12:34:31 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:01.786 12:34:31 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:01.786 12:34:31 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:01.786 12:34:31 -- scripts/common.sh@368 -- # return 0 00:04:01.786 12:34:31 -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:01.786 12:34:31 -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:01.786 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:01.786 --rc genhtml_branch_coverage=1 00:04:01.786 --rc genhtml_function_coverage=1 00:04:01.786 --rc genhtml_legend=1 00:04:01.786 --rc geninfo_all_blocks=1 00:04:01.786 --rc geninfo_unexecuted_blocks=1 00:04:01.786 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:01.786 ' 00:04:01.786 12:34:31 -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:01.786 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:01.786 --rc genhtml_branch_coverage=1 00:04:01.786 --rc genhtml_function_coverage=1 00:04:01.786 --rc genhtml_legend=1 00:04:01.786 --rc geninfo_all_blocks=1 00:04:01.786 --rc geninfo_unexecuted_blocks=1 00:04:01.786 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:01.786 ' 00:04:01.786 12:34:31 -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:01.786 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:01.786 --rc genhtml_branch_coverage=1 00:04:01.786 --rc genhtml_function_coverage=1 00:04:01.786 --rc genhtml_legend=1 00:04:01.786 --rc geninfo_all_blocks=1 00:04:01.786 --rc geninfo_unexecuted_blocks=1 00:04:01.786 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:01.786 ' 00:04:01.786 12:34:31 -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:01.786 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:01.786 --rc genhtml_branch_coverage=1 00:04:01.786 --rc genhtml_function_coverage=1 00:04:01.786 --rc genhtml_legend=1 00:04:01.786 --rc geninfo_all_blocks=1 00:04:01.786 --rc geninfo_unexecuted_blocks=1 00:04:01.786 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:01.786 ' 00:04:01.786 12:34:31 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:04:01.786 12:34:31 -- nvmf/common.sh@7 -- # uname -s 00:04:01.786 12:34:31 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:01.786 12:34:31 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:01.786 12:34:31 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:01.786 12:34:31 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:01.786 12:34:31 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:01.786 12:34:31 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:01.786 12:34:31 -- nvmf/common.sh@14 -- 
# NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:01.787 12:34:31 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:01.787 12:34:31 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:01.787 12:34:31 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:01.787 12:34:31 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809f3706-e051-e711-906e-0017a4403562 00:04:01.787 12:34:31 -- nvmf/common.sh@18 -- # NVME_HOSTID=809f3706-e051-e711-906e-0017a4403562 00:04:01.787 12:34:31 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:01.787 12:34:31 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:01.787 12:34:31 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:01.787 12:34:31 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:01.787 12:34:31 -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:04:01.787 12:34:31 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:01.787 12:34:31 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:01.787 12:34:31 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:01.787 12:34:31 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:01.787 12:34:31 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:01.787 12:34:31 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:01.787 12:34:31 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:01.787 12:34:31 -- paths/export.sh@5 -- # export PATH 00:04:01.787 12:34:31 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:01.787 12:34:31 -- nvmf/common.sh@51 -- # : 0 00:04:01.787 12:34:31 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:01.787 12:34:31 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:01.787 12:34:31 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:01.787 12:34:31 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:01.787 12:34:31 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:01.787 12:34:31 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:01.787 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:01.787 12:34:31 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:01.787 12:34:31 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:01.787 12:34:31 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:01.787 12:34:31 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:01.787 12:34:31 -- spdk/autotest.sh@32 -- # uname -s 00:04:01.787 
12:34:31 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:01.787 12:34:31 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:01.787 12:34:31 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:01.787 12:34:31 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:04:01.787 12:34:31 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:04:01.787 12:34:31 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:01.787 12:34:31 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:01.787 12:34:31 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:01.787 12:34:31 -- spdk/autotest.sh@48 -- # udevadm_pid=554385 00:04:01.787 12:34:31 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:01.787 12:34:31 -- pm/common@17 -- # local monitor 00:04:01.787 12:34:31 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:01.787 12:34:31 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:01.787 12:34:31 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:01.787 12:34:31 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:01.787 12:34:31 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:01.787 12:34:31 -- pm/common@25 -- # sleep 1 00:04:01.787 12:34:31 -- pm/common@21 -- # date +%s 00:04:01.787 12:34:31 -- pm/common@21 -- # date +%s 00:04:01.787 12:34:31 -- pm/common@21 -- # date +%s 00:04:01.787 12:34:31 -- pm/common@21 -- # date +%s 00:04:01.787 12:34:31 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732793671 00:04:01.787 12:34:31 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732793671 00:04:01.787 12:34:31 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732793671 00:04:01.787 12:34:31 -- pm/common@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1732793671 00:04:01.787 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732793671_collect-vmstat.pm.log 00:04:01.787 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732793671_collect-cpu-load.pm.log 00:04:01.787 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732793671_collect-cpu-temp.pm.log 00:04:01.787 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1732793671_collect-bmc-pm.bmc.pm.log 00:04:02.724 12:34:32 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:02.724 12:34:32 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:02.724 12:34:32 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:02.724 12:34:32 -- common/autotest_common.sh@10 -- # set +x 
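[Editor's note] The four "Redirecting to …pm.log" lines above show the resource-monitor startup pattern: each collector is backgrounded, its log file is stamped with one shared epoch from `date +%s`, and a pid file is written so the TERM handler traced at the top of this section (`pm/common@43`/`@50`) can reap it later. A minimal runnable sketch of that pattern, with a `sleep` standing in for the real collector scripts and the pid-file convention assumed rather than copied from pm/common:

# Sketch of the pm monitor start/stop pattern; paths, the stand-in
# collector, and the .pid naming are assumptions for illustration.
out=${OUTPUT_DIR:-/tmp/power-demo} ts=$(date +%s)
mkdir -p "$out"
for mon in collect-cpu-load collect-vmstat; do
    # stand-in collector; the real scripts live under scripts/perf/pm/
    sleep 300 > "$out/monitor.autotest.sh.${ts}_$mon.pm.log" &
    echo $! > "$out/$mon.pid"     # pid file later consumed by kill -TERM
done
# teardown, mirroring the kill -TERM loop seen earlier in the log:
for pid_f in "$out"/*.pid; do
    kill -TERM "$(<"$pid_f")" 2>/dev/null
done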
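[Editor's note] A few lines further up, the xtrace walks scripts/common.sh's version check (`lt 1.15 2` via `cmp_versions 1.15 '<' 2`): both versions are split on `.`/`-` into arrays, then compared component by component, padding the shorter array with zeros. A standalone simplified sketch of that comparison, not the SPDK helper verbatim:

# lt A B: succeed (return 0) iff dotted version A is strictly less than B.
lt() {
    local -a ver1 ver2
    IFS=.- read -ra ver1 <<< "$1"    # "1.15" -> (1 15)
    IFS=.- read -ra ver2 <<< "$2"    # "2"    -> (2)
    local v len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < len; v++ )); do
        (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1   # greater -> not lt
        (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0   # smaller -> lt
    done
    return 1                                              # equal -> not lt
}

lt 1.15 2 && echo "lcov 1.15 predates 2.x"   # matches the 'lt 1.15 2' trace

This is why the run above selects the `--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1` option spelling: the installed lcov reports 1.15, which the check places on the pre-2.0 branch.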
00:04:02.724 12:34:32 -- spdk/autotest.sh@59 -- # create_test_list 00:04:02.724 12:34:32 -- common/autotest_common.sh@752 -- # xtrace_disable 00:04:02.724 12:34:32 -- common/autotest_common.sh@10 -- # set +x 00:04:02.724 12:34:32 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:04:02.724 12:34:32 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:02.725 12:34:32 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:02.725 12:34:32 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:04:02.725 12:34:32 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:04:02.725 12:34:32 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:02.725 12:34:32 -- common/autotest_common.sh@1457 -- # uname 00:04:02.725 12:34:32 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:04:02.725 12:34:32 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:02.725 12:34:32 -- common/autotest_common.sh@1477 -- # uname 00:04:02.725 12:34:32 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:04:02.725 12:34:32 -- spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:02.725 12:34:32 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:04:02.983 lcov: LCOV version 1.15 00:04:02.983 12:34:32 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:04:09.548 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:10.926 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcno 00:04:19.045 12:34:48 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:19.045 12:34:48 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:19.045 12:34:48 -- common/autotest_common.sh@10 -- # set +x 00:04:19.045 12:34:48 -- spdk/autotest.sh@78 -- # rm -f 00:04:19.045 12:34:48 -- spdk/autotest.sh@81 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:21.584 0000:5e:00.0 (8086 0a54): Already using the nvme driver 00:04:21.584 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:04:21.584 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:04:21.584 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:04:21.584 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:04:21.584 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:04:21.584 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:04:21.584 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:04:21.584 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:04:21.584 
0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:04:21.844 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:04:21.844 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:04:21.844 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:04:21.844 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:04:21.844 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:04:21.844 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:04:21.844 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:04:22.104 12:34:51 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:22.104 12:34:51 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:22.104 12:34:51 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:22.104 12:34:51 -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:22.104 12:34:51 -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:22.104 12:34:51 -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:22.104 12:34:51 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:22.104 12:34:51 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:22.104 12:34:51 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:22.104 12:34:51 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:22.104 12:34:51 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:22.104 12:34:51 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:22.104 12:34:51 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:22.104 12:34:51 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:22.104 12:34:51 -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:22.104 No valid GPT data, bailing 00:04:22.104 12:34:52 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:22.104 12:34:52 -- scripts/common.sh@394 -- # pt= 00:04:22.104 12:34:52 -- scripts/common.sh@395 -- # return 1 00:04:22.104 12:34:52 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:22.104 1+0 records in 00:04:22.104 1+0 records out 00:04:22.104 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00457153 s, 229 MB/s 00:04:22.104 12:34:52 -- spdk/autotest.sh@105 -- # sync 00:04:22.104 12:34:52 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:22.104 12:34:52 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:22.104 12:34:52 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:27.381 12:34:56 -- spdk/autotest.sh@111 -- # uname -s 00:04:27.381 12:34:56 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:27.381 12:34:56 -- spdk/autotest.sh@111 -- # [[ 1 -eq 1 ]] 00:04:27.381 12:34:56 -- spdk/autotest.sh@112 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:27.381 12:34:56 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:27.381 12:34:56 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:27.381 12:34:56 -- common/autotest_common.sh@10 -- # set +x 00:04:27.381 ************************************ 00:04:27.381 START TEST setup.sh 00:04:27.381 ************************************ 00:04:27.381 12:34:56 setup.sh -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:04:27.381 * Looking for test storage... 
00:04:27.381 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:27.381 12:34:56 setup.sh -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:27.381 12:34:56 setup.sh -- common/autotest_common.sh@1693 -- # lcov --version 00:04:27.381 12:34:56 setup.sh -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@336 -- # IFS=.-: 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@336 -- # read -ra ver1 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@337 -- # IFS=.-: 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@337 -- # read -ra ver2 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@338 -- # local 'op=<' 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@340 -- # ver1_l=2 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@341 -- # ver2_l=1 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@344 -- # case "$op" in 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@345 -- # : 1 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@365 -- # decimal 1 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@353 -- # local d=1 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@355 -- # echo 1 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@365 -- # ver1[v]=1 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@366 -- # decimal 2 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@353 -- # local d=2 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@355 -- # echo 2 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@366 -- # ver2[v]=2 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:27.381 12:34:57 setup.sh -- scripts/common.sh@368 -- # return 0 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:27.381 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.381 --rc genhtml_branch_coverage=1 00:04:27.381 --rc genhtml_function_coverage=1 00:04:27.381 --rc genhtml_legend=1 00:04:27.381 --rc geninfo_all_blocks=1 00:04:27.381 --rc geninfo_unexecuted_blocks=1 00:04:27.381 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.381 ' 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:27.381 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.381 --rc genhtml_branch_coverage=1 00:04:27.381 --rc genhtml_function_coverage=1 00:04:27.381 --rc genhtml_legend=1 00:04:27.381 --rc geninfo_all_blocks=1 00:04:27.381 --rc geninfo_unexecuted_blocks=1 
00:04:27.381 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.381 ' 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:27.381 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.381 --rc genhtml_branch_coverage=1 00:04:27.381 --rc genhtml_function_coverage=1 00:04:27.381 --rc genhtml_legend=1 00:04:27.381 --rc geninfo_all_blocks=1 00:04:27.381 --rc geninfo_unexecuted_blocks=1 00:04:27.381 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.381 ' 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:27.381 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.381 --rc genhtml_branch_coverage=1 00:04:27.381 --rc genhtml_function_coverage=1 00:04:27.381 --rc genhtml_legend=1 00:04:27.381 --rc geninfo_all_blocks=1 00:04:27.381 --rc geninfo_unexecuted_blocks=1 00:04:27.381 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.381 ' 00:04:27.381 12:34:57 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:27.381 12:34:57 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:27.381 12:34:57 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:27.381 12:34:57 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:27.381 ************************************ 00:04:27.381 START TEST acl 00:04:27.381 ************************************ 00:04:27.381 12:34:57 setup.sh.acl -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:04:27.381 * Looking for test storage... 
00:04:27.381 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:27.381 12:34:57 setup.sh.acl -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:27.381 12:34:57 setup.sh.acl -- common/autotest_common.sh@1693 -- # lcov --version 00:04:27.381 12:34:57 setup.sh.acl -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:27.381 12:34:57 setup.sh.acl -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@336 -- # IFS=.-: 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@336 -- # read -ra ver1 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@337 -- # IFS=.-: 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@337 -- # read -ra ver2 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@338 -- # local 'op=<' 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@340 -- # ver1_l=2 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@341 -- # ver2_l=1 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@344 -- # case "$op" in 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@345 -- # : 1 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@365 -- # decimal 1 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@353 -- # local d=1 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@355 -- # echo 1 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@365 -- # ver1[v]=1 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@366 -- # decimal 2 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@353 -- # local d=2 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@355 -- # echo 2 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@366 -- # ver2[v]=2 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:27.382 12:34:57 setup.sh.acl -- scripts/common.sh@368 -- # return 0 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:27.382 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.382 --rc genhtml_branch_coverage=1 00:04:27.382 --rc genhtml_function_coverage=1 00:04:27.382 --rc genhtml_legend=1 00:04:27.382 --rc geninfo_all_blocks=1 00:04:27.382 --rc geninfo_unexecuted_blocks=1 00:04:27.382 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.382 ' 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:27.382 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.382 --rc genhtml_branch_coverage=1 00:04:27.382 --rc 
genhtml_function_coverage=1 00:04:27.382 --rc genhtml_legend=1 00:04:27.382 --rc geninfo_all_blocks=1 00:04:27.382 --rc geninfo_unexecuted_blocks=1 00:04:27.382 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.382 ' 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:27.382 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.382 --rc genhtml_branch_coverage=1 00:04:27.382 --rc genhtml_function_coverage=1 00:04:27.382 --rc genhtml_legend=1 00:04:27.382 --rc geninfo_all_blocks=1 00:04:27.382 --rc geninfo_unexecuted_blocks=1 00:04:27.382 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.382 ' 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:27.382 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:27.382 --rc genhtml_branch_coverage=1 00:04:27.382 --rc genhtml_function_coverage=1 00:04:27.382 --rc genhtml_legend=1 00:04:27.382 --rc geninfo_all_blocks=1 00:04:27.382 --rc geninfo_unexecuted_blocks=1 00:04:27.382 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:27.382 ' 00:04:27.382 12:34:57 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1658 -- # local nvme bdf 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:27.382 12:34:57 setup.sh.acl -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:27.382 12:34:57 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:27.382 12:34:57 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:27.382 12:34:57 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:27.382 12:34:57 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:27.382 12:34:57 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:27.382 12:34:57 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:27.382 12:34:57 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:30.673 12:35:00 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:30.673 12:35:00 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:30.673 12:35:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:30.673 12:35:00 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:30.673 12:35:00 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:30.673 12:35:00 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:33.209 Hugepages 00:04:33.209 node hugesize free / total 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 
setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 00:04:33.209 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:02 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 
00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:33.209 12:35:03 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:33.210 12:35:03 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:33.210 12:35:03 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:33.210 12:35:03 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:33.210 12:35:03 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:33.210 ************************************ 00:04:33.210 START TEST denied 00:04:33.210 ************************************ 00:04:33.210 12:35:03 setup.sh.acl.denied -- 
common/autotest_common.sh@1129 -- # denied 00:04:33.210 12:35:03 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:33.210 12:35:03 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:33.210 12:35:03 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:33.210 12:35:03 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.210 12:35:03 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:36.501 0000:5e:00.0 (8086 0a54): Skipping denied controller at 0000:5e:00.0 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:36.501 12:35:06 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:41.774 00:04:41.774 real 0m7.757s 00:04:41.774 user 0m2.349s 00:04:41.774 sys 0m4.741s 00:04:41.774 12:35:10 setup.sh.acl.denied -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:41.774 12:35:10 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:41.774 ************************************ 00:04:41.774 END TEST denied 00:04:41.774 ************************************ 00:04:41.774 12:35:10 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:41.774 12:35:10 setup.sh.acl -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:41.774 12:35:10 setup.sh.acl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:41.774 12:35:10 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:41.774 ************************************ 00:04:41.774 START TEST allowed 00:04:41.774 ************************************ 00:04:41.774 12:35:10 setup.sh.acl.allowed -- common/autotest_common.sh@1129 -- # allowed 00:04:41.774 12:35:10 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:41.774 12:35:10 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:41.774 12:35:10 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:41.774 12:35:10 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:41.774 12:35:10 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:48.344 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:04:48.344 12:35:17 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:48.344 12:35:17 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:48.344 12:35:17 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:48.344 12:35:17 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:48.344 12:35:17 setup.sh.acl.allowed 
-- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:50.880 00:04:50.880 real 0m9.956s 00:04:50.880 user 0m2.337s 00:04:50.880 sys 0m4.463s 00:04:50.880 12:35:20 setup.sh.acl.allowed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:50.880 12:35:20 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:50.880 ************************************ 00:04:50.880 END TEST allowed 00:04:50.880 ************************************ 00:04:50.880 00:04:50.880 real 0m23.862s 00:04:50.880 user 0m6.829s 00:04:50.880 sys 0m13.295s 00:04:50.880 12:35:20 setup.sh.acl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:50.880 12:35:20 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:50.880 ************************************ 00:04:50.880 END TEST acl 00:04:50.880 ************************************ 00:04:51.140 12:35:21 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:51.140 12:35:21 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:51.140 12:35:21 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:51.140 12:35:21 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:51.140 ************************************ 00:04:51.140 START TEST hugepages 00:04:51.140 ************************************ 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:04:51.140 * Looking for test storage... 00:04:51.140 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lcov --version 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@336 -- # IFS=.-: 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@336 -- # read -ra ver1 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@337 -- # IFS=.-: 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@337 -- # read -ra ver2 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@338 -- # local 'op=<' 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@340 -- # ver1_l=2 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@341 -- # ver2_l=1 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@344 -- # case "$op" in 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@345 -- # : 1 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@365 -- # decimal 1 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=1 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 1 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@365 -- # ver1[v]=1 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@366 -- # decimal 2 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@353 -- # local d=2 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@355 -- # echo 2 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@366 -- # ver2[v]=2 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:51.140 12:35:21 setup.sh.hugepages -- scripts/common.sh@368 -- # return 0 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:04:51.140 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:51.140 --rc genhtml_branch_coverage=1 00:04:51.140 --rc genhtml_function_coverage=1 00:04:51.140 --rc genhtml_legend=1 00:04:51.140 --rc geninfo_all_blocks=1 00:04:51.140 --rc geninfo_unexecuted_blocks=1 00:04:51.140 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:51.140 ' 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:04:51.140 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:51.140 --rc genhtml_branch_coverage=1 00:04:51.140 --rc genhtml_function_coverage=1 00:04:51.140 --rc genhtml_legend=1 00:04:51.140 --rc geninfo_all_blocks=1 00:04:51.140 --rc geninfo_unexecuted_blocks=1 00:04:51.140 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:51.140 ' 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:04:51.140 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:51.140 --rc genhtml_branch_coverage=1 00:04:51.140 --rc genhtml_function_coverage=1 00:04:51.140 --rc genhtml_legend=1 00:04:51.140 --rc geninfo_all_blocks=1 00:04:51.140 --rc geninfo_unexecuted_blocks=1 00:04:51.140 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:51.140 ' 00:04:51.140 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:04:51.140 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:51.140 --rc genhtml_branch_coverage=1 00:04:51.140 --rc genhtml_function_coverage=1 00:04:51.140 --rc genhtml_legend=1 00:04:51.140 --rc geninfo_all_blocks=1 00:04:51.140 --rc geninfo_unexecuted_blocks=1 00:04:51.140 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:04:51.140 ' 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:51.140 12:35:21 
setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 70618528 kB' 'MemAvailable: 75531860 kB' 'Buffers: 10116 kB' 'Cached: 14759776 kB' 'SwapCached: 0 kB' 'Active: 11396144 kB' 'Inactive: 4039056 kB' 'Active(anon): 10203160 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 668672 kB' 'Mapped: 174312 kB' 'Shmem: 9537852 kB' 'KReclaimable: 517228 kB' 'Slab: 1046532 kB' 'SReclaimable: 517228 kB' 'SUnreclaim: 529304 kB' 'KernelStack: 16064 kB' 'PageTables: 8748 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52434204 kB' 'Committed_AS: 11531200 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200068 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:51.140 12:35:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:51.141 12:35:21 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:51.141 12:35:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:51.141 12:35:21 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:51.141 12:35:21 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 
[xtrace condensed: the setup/common.sh@31-32 read/compare loop steps through every field of the dump above, continuing until it reaches Hugepagesize]
00:04:51.142 12:35:21 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:51.142 12:35:21 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:51.142 12:35:21 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048
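With default_hugepages known, the next stretch of trace (hugepages.sh@17-44) records the two kernel knobs for that page size, the per-size sysfs file and the global /proc/sys/vm/nr_hugepages, then walks both NUMA nodes and zeroes any pre-existing reservations before the test allocates its own. Condensed from that trace (the node glob here is an assumption; the script iterates the nodes_sys array it just built), the clear-out step amounts to:

# clear_hp, as sketched from the hugepages.sh@36-44 trace below (needs root).
clear_hp() {
    local node hp
    for node in /sys/devices/system/node/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"   # release this node's reserved pages
        done
    done
    export CLEAR_HUGE=yes   # tell setup.sh a clean slate is expected
}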
00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGEMEM 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGENODE 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v NRHUGE 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@197 -- # get_nodes 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@26 -- # local node 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@31 -- # no_nodes=2 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@198 -- # clear_hp 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:04:51.402 12:35:21 setup.sh.hugepages -- setup/hugepages.sh@200 -- # run_test single_node_setup single_node_setup 00:04:51.402 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:51.402 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:51.402 12:35:21 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:51.402 ************************************ 00:04:51.402 START TEST single_node_setup 00:04:51.402 ************************************ 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1129 -- # single_node_setup 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@135 -- # get_test_nr_hugepages 2097152 0 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup 
-- setup/hugepages.sh@48 -- # local size=2097152 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@49 -- # (( 2 > 1 )) 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@50 -- # shift 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # node_ids=('0') 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@51 -- # local node_ids 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # user_nodes=('0') 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@61 -- # local user_nodes 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@68 -- # (( 1 > 0 )) 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}" 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@72 -- # return 0 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # NRHUGE=1024 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # HUGENODE=0 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@136 -- # setup output 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:51.402 12:35:21 setup.sh.hugepages.single_node_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:54.694 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:54.694 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:57.987 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@137 -- # 
verify_nr_hugepages 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@88 -- # local node 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@89 -- # local sorted_t 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@90 -- # local sorted_s 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@91 -- # local surp 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@92 -- # local resv 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@93 -- # local anon 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node= 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.987 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72794508 kB' 'MemAvailable: 77707792 kB' 'Buffers: 10116 kB' 'Cached: 14759908 kB' 'SwapCached: 0 kB' 'Active: 11399096 kB' 'Inactive: 4039056 kB' 'Active(anon): 10206112 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670908 kB' 'Mapped: 174092 kB' 'Shmem: 9537984 kB' 'KReclaimable: 517180 kB' 'Slab: 1045904 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 528724 kB' 'KernelStack: 16112 kB' 'PageTables: 8592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11535660 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 199972 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
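verify_nr_hugepages starts by deciding whether transparent huge pages could distort the count: the [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] test above is the mode string read from /sys/kernel/mm/transparent_hugepage/enabled. In effect (a sketch of hugepages.sh@95-96 as the trace shows it):

# Anonymous THP usage only counts toward verification when THP is not "never".
anon=0
if [[ $(</sys/kernel/mm/transparent_hugepage/enabled) != *"[never]"* ]]; then
    anon=$(get_meminfo AnonHugePages)   # kB of anon memory backed by THP
fi

Here the mode is "always [madvise] never", so the branch is taken and AnonHugePages is read; the dump shows it is 0 kB, so the scan that follows ends in anon=0.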
[xtrace condensed: the same setup/common.sh@31-32 read/compare loop steps through the dump above until it reaches AnonHugePages]
00:04:57.989 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:57.989 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:57.989 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:57.989 12:35:27 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@96 -- # anon=0
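With anon settled at 0, the verifier turns to the hugetlb counters themselves: HugePages_Surp counts pages the kernel handed out beyond the static pool (surplus overcommit) and HugePages_Rsvd counts pages promised to mappings but not yet faulted in. Roughly, as a sketch of the bookkeeping rather than the verbatim hugepages.sh logic, the check being built up is:

surp=$(get_meminfo HugePages_Surp)    # overcommit pages beyond the pool: 0 here
resv=$(get_meminfo HugePages_Rsvd)    # reserved but not yet faulted in: 0 here
total=$(get_meminfo HugePages_Total)  # 1024 after NRHUGE=1024 HUGENODE=0 above
# The statically configured pool excludes surplus pages:
(( total - surp == 1024 )) || echo "unexpected pool size: $((total - surp))"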
setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72796160 kB' 'MemAvailable: 77709444 kB' 'Buffers: 10116 kB' 'Cached: 14759912 kB' 'SwapCached: 0 kB' 'Active: 11398540 kB' 'Inactive: 4039056 kB' 'Active(anon): 10205556 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670440 kB' 'Mapped: 174060 kB' 'Shmem: 9537988 kB' 'KReclaimable: 517180 kB' 'Slab: 1045888 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 528708 kB' 'KernelStack: 16096 kB' 'PageTables: 8476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11535680 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 199940 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
[[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.990 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.991 12:35:27 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # 
continue
[... setup/common.sh@31 (IFS=': '; read -r var val _) and @32 ([[ $var == HugePages_Surp ]] || continue) repeat for each remaining /proc/meminfo key, AnonPages through HugePages_Rsvd, none of which match ...]
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@98 -- # surp=0
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
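The setup/common.sh@31-@33 markers above are the body of the get_meminfo helper. A minimal sketch of that helper, reconstructed only from this trace -- simplified, not the authoritative script (see setup/common.sh in the SPDK tree):

#!/usr/bin/env bash
# get_meminfo KEY [NODE] -- print KEY's value from /proc/meminfo, or from
# /sys/devices/system/node/nodeN/meminfo when NODE is given.
shopt -s extglob

get_meminfo() {
	local get=$1 node=${2:-}
	local mem_f=/proc/meminfo mem line var val _
	# @23-@24: prefer the per-node file when a node number is supplied
	[[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
		mem_f=/sys/devices/system/node/node$node/meminfo
	mapfile -t mem <"$mem_f"            # @28
	mem=("${mem[@]#Node +([0-9]) }")    # @29: strip the "Node N " prefix
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<<"$line"  # @31
		[[ $var == "$get" ]] || continue       # @32: the repeated compares above
		echo "$val"                            # @33
		return 0
	done
	return 1
}

get_meminfo HugePages_Surp   # prints 0 on this box, hence surp=0 at @98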
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:57.992 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:57.993 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72795564 kB' 'MemAvailable: 77708848 kB' 'Buffers: 10116 kB' 'Cached: 14759928 kB' 'SwapCached: 0 kB' 'Active: 11398632 kB' 'Inactive: 4039056 kB' 'Active(anon): 10205648 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670492 kB' 'Mapped: 174060 kB' 'Shmem: 9538004 kB' 'KReclaimable: 517180 kB' 'Slab: 1045976 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 528796 kB' 'KernelStack: 16080 kB' 'PageTables: 8812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11534324 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 199972 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
[... setup/common.sh@31-@32 compare each key of this snapshot against HugePages_Rsvd and continue past the non-matches, MemTotal through HugePages_Free ...]
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@99 -- # resv=0
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:04:57.995 nr_hugepages=1024
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:04:57.995 resv_hugepages=0
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:04:57.995 surplus_hugepages=0
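The surp and resv values just computed feed the consistency check at hugepages.sh@106-@108 a few lines below. A sketch of that identity, using the get_meminfo sketch above; the variable names are assumptions inferred from the expanded values in the trace:

# All 1024 allocated hugepages must still be free before the test maps
# anything, once surplus and reserved pages are folded back in.
nr_hugepages=1024                              # requested pool size
free_hugepages=$(get_meminfo HugePages_Free)   # 1024 in the snapshot above
surp=$(get_meminfo HugePages_Surp)             # 0 (@98)
resv=$(get_meminfo HugePages_Rsvd)             # 0 (@99)
(( free_hugepages == nr_hugepages + surp + resv )) || echo "hugepages leaked"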
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:04:57.995 anon_hugepages=0
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:57.995 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72794988 kB' 'MemAvailable: 77708272 kB' 'Buffers: 10116 kB' 'Cached: 14759928 kB' 'SwapCached: 0 kB' 'Active: 11398788 kB' 'Inactive: 4039056 kB' 'Active(anon): 10205804 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670608 kB' 'Mapped: 174060 kB' 'Shmem: 9538004 kB' 'KReclaimable: 517180 kB' 'Slab: 1045976 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 528796 kB' 'KernelStack: 16160 kB' 'PageTables: 8860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11536656 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200052 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
[... setup/common.sh@31-@32 scan this snapshot against HugePages_Total, skipping every key before it ...]
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 1024
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@111 -- # get_nodes
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@26 -- # local node
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@31 -- # no_nodes=2
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@18 -- # local node=0
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@19 -- # local var val
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': '
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _
00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064936 kB' 'MemFree: 33128408 kB' 'MemUsed: 14936528 kB' 'SwapCached: 0 kB' 'Active: 8355872 kB' 'Inactive: 3549192 kB' 'Active(anon): 7504760 kB' 'Inactive(anon): 0 kB' 'Active(file): 851112 kB' 'Inactive(file): 3549192 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11555536 kB' 'Mapped: 143456 kB' 'AnonPages: 352264 kB' 'Shmem: 7155232 kB' 'KernelStack: 7704 kB' 'PageTables: 5072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 196428 kB' 'Slab: 480628 kB' 'SReclaimable: 196428 kB' 'SUnreclaim: 284200 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
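get_nodes (@26-@32 above) enumerates the NUMA nodes before the @114-@116 loop re-queries each node's counters. A sketch of that enumeration; reading the count from each node's 2048kB hugepage directory is an assumption, since the trace only shows the resulting values 1024 and 0:

shopt -s extglob
declare -a nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do   # @28
	# hypothetical source of the per-node count assigned at @29
	nodes_sys[${node##*node}]=$(<"$node/hugepages/hugepages-2048kB/nr_hugepages")
done
no_nodes=${#nodes_sys[@]}    # 2 on this machine (@31)
(( no_nodes > 0 ))           # @32: fail if no NUMA nodes were found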
setup/common.sh@31 -- # read -r var val _ 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.998 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- 
setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # 
IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:57.999 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read 
-r var val _ 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # continue 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # echo 0 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/common.sh@33 -- # return 0 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:04:58.000 node0=1024 expecting 1024 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:04:58.000 00:04:58.000 real 0m6.763s 00:04:58.000 user 0m1.311s 00:04:58.000 sys 0m2.364s 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:58.000 12:35:28 setup.sh.hugepages.single_node_setup -- common/autotest_common.sh@10 -- # set +x 00:04:58.000 ************************************ 00:04:58.000 END TEST single_node_setup 00:04:58.000 ************************************ 00:04:58.260 12:35:28 setup.sh.hugepages -- setup/hugepages.sh@201 -- # run_test even_2G_alloc even_2G_alloc 00:04:58.260 12:35:28 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:58.260 12:35:28 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:58.260 12:35:28 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:58.260 ************************************ 00:04:58.260 START TEST even_2G_alloc 00:04:58.260 ************************************ 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1129 -- # even_2G_alloc 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@142 -- # get_test_nr_hugepages 2097152 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@63 -- # local 
00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 512 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 1 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # : 0 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # NRHUGE=1024 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@143 -- # setup output 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:58.260 12:35:28 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:01.563 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:01.563 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:01.563 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@144 -- # verify_nr_hugepages 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@88 -- # local node 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_s
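The @80-@83 iterations above are get_test_nr_hugepages_per_node splitting the 1024 requested pages evenly over the two NUMA nodes; the ": 512" / ": 1" entries are xtrace echoes of intermediate arithmetic. A sketch of that distribution, reconstructed from the traced values (illustrative, not the verbatim setup/hugepages.sh source):

    #!/usr/bin/env bash
    # Even split of hugepages across nodes, matching the traced values.
    _nr_hugepages=1024
    _no_nodes=2
    declare -a nodes_test
    while (( _no_nodes > 0 )); do
        nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))  # 512 on both passes
        : $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))         # traces as ": 512", then ": 0"
        : $(( _no_nodes -= 1 ))                                     # traces as ": 1", then ": 0"
    done
    echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"            # node0=512 node1=512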
00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72801160 kB' 'MemAvailable: 77714444 kB' 'Buffers: 10116 kB' 'Cached: 14760060 kB' 'SwapCached: 0 kB' 'Active: 11398600 kB' 'Inactive: 4039056 kB' 'Active(anon): 10205616 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670772 kB' 'Mapped: 174156 kB' 'Shmem: 9538136 kB' 'KReclaimable: 517180 kB' 'Slab: 1046576 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529396 kB' 'KernelStack: 16016 kB' 'PageTables: 8672 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11534864 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200132 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:05:01.563 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # MemTotal..HardwareCorrupted: no match for AnonHugePages, continue 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
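The @95-@98 steps above gather the adjustments verify_nr_hugepages applies before comparing totals: anonymous THP usage (zero here, since AnonHugePages is 0 kB) and surplus pages. A hedged sketch of that check, reconstructed from the trace; the sysfs path is the standard THP control file and an assumption on my part, and get_meminfo is the helper sketched earlier:

    # Reconstruction of hugepages.sh@95-@98 (illustrative).
    # The THP mode string reads like "always [madvise] never"; the
    # bracketed word is the active mode.
    thp_mode=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # assumed path
    anon=0
    if [[ $thp_mode != *"[never]"* ]]; then
        # THP is not disabled, so THP-backed anon memory must be counted.
        anon=$(get_meminfo AnonHugePages)   # 0 in this run
    fi
    surp=$(get_meminfo HugePages_Surp)      # scanned next in the trace; 0 here
    echo "anon=$anon surp=$surp"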
00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72797644 kB' 'MemAvailable: 77710928 kB' 'Buffers: 10116 kB' 'Cached: 14760064 kB' 'SwapCached: 0 kB' 'Active: 11402248 kB' 'Inactive: 4039056 kB' 'Active(anon): 10209264 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 674512 kB' 'Mapped: 174568 kB' 'Shmem: 9538140 kB' 'KReclaimable: 517180 kB' 'Slab: 1046524 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529344 kB' 'KernelStack: 15984 kB' 'PageTables: 8564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11538712 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200068 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:05:01.565 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31-32 -- # MemTotal..HugePages_Total: no match for HugePages_Surp, continue 00:05:01.566 12:35:31 setup.sh.hugepages.even_2G_alloc --
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@98 -- # surp=0
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:01.567 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72794020 kB' 'MemAvailable: 77707304 kB' 'Buffers: 10116 kB' 'Cached: 14760080 kB' 'SwapCached: 0 kB' 'Active: 11403904 kB' 'Inactive: 4039056 kB' 'Active(anon): 10210920 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 675596 kB' 'Mapped: 174960 kB' 'Shmem: 9538156 kB' 'KReclaimable: 517180 kB' 'Slab: 1046524 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529344 kB' 'KernelStack: 15984 kB' 'PageTables: 8596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11540064 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200068 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
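
Editor's note: the trace above is the get_meminfo helper in scripts/setup/common.sh walking /proc/meminfo one "Key: value" pair at a time under set -x, which is why every non-matching key produces an IFS=': ' / read / continue triple in the log. Below is a minimal standalone sketch of the same parsing technique; the name my_get_meminfo is hypothetical, and the sketch assumes only a Linux /proc/meminfo plus, for per-node queries, /sys/devices/system/node/nodeN/meminfo.

    #!/usr/bin/env bash
    # Sketch only: look up one "Key: value [kB]" pair the way the traced helper does.
    my_get_meminfo() {
        local get=$1 node=${2:-}            # key to look up, optional NUMA node
        local mem_f=/proc/meminfo line var val _
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        while read -r line; do
            line=${line#"Node $node "}       # per-node files prefix each line with "Node N "
            IFS=': ' read -r var val _ <<< "$line"
            if [[ $var == "$get" ]]; then
                echo "$val"
                return 0
            fi
        done < "$mem_f"
        return 1
    }

    my_get_meminfo HugePages_Surp            # prints 0 on the box traced above
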
[trace condensed: keys MemTotal through HugePages_Free are skipped with continue while scanning for HugePages_Rsvd]
00:05:01.568 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024
00:05:01.569 nr_hugepages=1024
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:01.569 resv_hugepages=0
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:01.569 surplus_hugepages=0
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:01.569 anon_hugepages=0
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages ))
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:01.569 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72801884 kB' 'MemAvailable: 77715168 kB' 'Buffers: 10116 kB' 'Cached: 14760120 kB' 'SwapCached: 0 kB' 'Active: 11397932 kB' 'Inactive: 4039056 kB' 'Active(anon): 10204948 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670064 kB' 'Mapped: 174064 kB' 'Shmem: 9538196 kB' 'KReclaimable: 517180 kB' 'Slab: 1046524 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529344 kB' 'KernelStack: 15968 kB' 'PageTables: 8516 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11533964 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200068 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
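
Editor's note: at setup/hugepages.sh@106-109 the test asserts that the 1024 pages it configured are exactly what the kernel now reports once surplus and reserved pages are accounted for. A hedged sketch of that arithmetic, reusing the hypothetical my_get_meminfo helper from the earlier note (values in comments are the ones observed in this run):

    nr_hugepages=1024
    surp=$(my_get_meminfo HugePages_Surp)     # 0
    resv=$(my_get_meminfo HugePages_Rsvd)     # 0
    total=$(my_get_meminfo HugePages_Total)   # 1024
    if (( total != nr_hugepages + surp + resv )); then
        echo "hugepage accounting mismatch: total=$total" >&2
        exit 1
    fi
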
[trace condensed: keys MemTotal through Unaccepted are skipped with continue while scanning for HugePages_Total]
00:05:01.570 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:01.570 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:05:01.570 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@26 -- # local node
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
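
Editor's note: get_nodes enumerates /sys/devices/system/node/node<N> with an extglob pattern and records how many 2 MiB pages the even_2G_alloc policy expects on each node; 1024 pages over the 2 nodes seen here gives 512 per node. A minimal sketch of that enumeration (variable names are illustrative, not the exact SPDK source):

    #!/usr/bin/env bash
    shopt -s extglob nullglob
    declare -A nodes_sys
    nodes=(/sys/devices/system/node/node+([0-9]))
    (( ${#nodes[@]} > 0 )) || { echo "no NUMA nodes found" >&2; exit 1; }
    per_node=$(( 1024 / ${#nodes[@]} ))         # 1024 / 2 = 512 on this box
    for node in "${nodes[@]}"; do
        nodes_sys[${node##*node}]=$per_node     # e.g. nodes_sys[0]=512, nodes_sys[1]=512
    done
    echo "no_nodes=${#nodes_sys[@]}"            # prints no_nodes=2 here
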
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:01.571 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064936 kB' 'MemFree: 34169428 kB' 'MemUsed: 13895508 kB' 'SwapCached: 0 kB' 'Active: 8354276 kB' 'Inactive: 3549192 kB' 'Active(anon): 7503164 kB' 'Inactive(anon): 0 kB' 'Active(file): 851112 kB' 'Inactive(file): 3549192 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11555648 kB' 'Mapped: 143472 kB' 'AnonPages: 350972 kB' 'Shmem: 7155344 kB' 'KernelStack: 7640 kB' 'PageTables: 4852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 196428 kB' 'Slab: 481052 kB' 'SReclaimable: 196428 kB' 'SUnreclaim: 284624 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
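
Editor's note: for the per-node check the trace swaps mem_f to /sys/devices/system/node/node0/meminfo, whose lines carry a "Node 0 " prefix that common.sh@29 strips before the same key scan runs again. Reusing the hypothetical my_get_meminfo sketch from the earlier note, the per-node read would look like this (expected values in this run: 512 total, 512 free, 0 surplus):

    for key in HugePages_Total HugePages_Free HugePages_Surp; do
        printf 'node0 %s: %s\n' "$key" "$(my_get_meminfo "$key" 0)"
    done
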
[trace condensed: node0 keys MemTotal through SUnreclaim are skipped with continue while scanning for HugePages_Surp]
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- 
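For context on the trace above: get_meminfo in setup/common.sh pulls one key out of /proc/meminfo, or out of the per-node /sys file when a NUMA node is given, which is why the log shows one read/compare/continue step per field. A minimal sketch of that helper, reconstructed from the xtrace (illustrative, not the verbatim SPDK source):

#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) pattern below

# Fetch one key (e.g. HugePages_Surp) from /proc/meminfo, or from the
# per-node file when a NUMA node number is given.
get_meminfo() {
	local get=$1 node=$2
	local var val
	local mem_f mem

	mem_f=/proc/meminfo
	# A per-node view exists on NUMA systems; prefer it when a node was passed.
	if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi

	mapfile -t mem < "$mem_f"
	# Per-node files prefix each line with "Node N "; strip it so keys line up.
	mem=("${mem[@]#Node +([0-9]) }")

	# Walk "Key: value [kB]" lines until the requested key matches.
	while IFS=': ' read -r var val _; do
		if [[ $var == "$get" ]]; then
			echo "$val"
			return 0
		fi
	done < <(printf '%s\n' "${mem[@]}")
	return 1
}

get_meminfo HugePages_Surp 0   # prints 0 in the run above

When the node argument is empty, the /sys path test fails and the helper falls back to the system-wide /proc/meminfo, matching the @23/@25 checks seen later in this log.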
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44220568 kB' 'MemFree: 38632456 kB' 'MemUsed: 5588112 kB' 'SwapCached: 0 kB' 'Active: 3044032 kB' 'Inactive: 489864 kB' 'Active(anon): 2702160 kB' 'Inactive(anon): 0 kB' 'Active(file): 341872 kB' 'Inactive(file): 489864 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3214592 kB' 'Mapped: 30592 kB' 'AnonPages: 319460 kB' 'Shmem: 2382856 kB' 'KernelStack: 8328 kB' 'PageTables: 3664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 320752 kB' 'Slab: 565472 kB' 'SReclaimable: 320752 kB' 'SUnreclaim: 244720 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:01.572 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
[... the @31 read / @32 compare / @32 continue cycle repeats for every remaining node1 meminfo field ...]
00:05:01.573 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:01.573 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
00:05:01.574 node0=512 expecting 512
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512'
00:05:01.574 node1=512 expecting 512
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@129 -- # [[ 512 == \5\1\2 ]]
00:05:01.574 real	0m3.295s
00:05:01.574 user	0m1.290s
00:05:01.574 sys	0m2.066s
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:01.574 12:35:31 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:01.574 ************************************
00:05:01.574 END TEST even_2G_alloc
00:05:01.574 ************************************
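The pass check traced at @125-@129 above is a set-collapse: each observed per-node count becomes a key of sorted_t, so an even split leaves exactly one distinct value, which is then compared against the expected share. A paraphrase of that logic (the surrounding harness, including the nodes_test fill and resv/surp adjustments, is assumed to have run already):

#!/usr/bin/env bash
# Paraphrase of the even_2G_alloc verification at hugepages.sh@125-@129;
# not the verbatim script.
nodes_test=(512 512)   # per-node counts observed above
declare -A sorted_t=()

for node in "${!nodes_test[@]}"; do
	sorted_t[${nodes_test[node]}]=1            # record each distinct count once
	echo "node$node=${nodes_test[node]} expecting 512"
done

# One distinct count and it is the expected one -> the test passes.
(( ${#sorted_t[@]} == 1 )) && [[ ${!sorted_t[*]} == 512 ]] && echo PASS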
00:05:01.574 12:35:31 setup.sh.hugepages -- setup/hugepages.sh@202 -- # run_test odd_alloc odd_alloc
00:05:01.574 12:35:31 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:01.574 12:35:31 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:01.574 12:35:31 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:01.574 ************************************
00:05:01.574 START TEST odd_alloc
00:05:01.574 ************************************
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1129 -- # odd_alloc
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@149 -- # get_test_nr_hugepages 2098176
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@48 -- # local size=2098176
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 ))
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1025
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # user_nodes=()
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1025
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 ))
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 ))
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=512
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 513
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 1
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=513
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # : 0
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 ))
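The @80-@83 loop above hands out pages from the highest-numbered node down, so the odd remainder lands on node 0: 1025 pages over 2 nodes gives node1=512 and node0=513, and the ': 513' / ': 1' entries are the leftover-page and leftover-node counters. A reconstruction consistent with those traced values (the exact helper body is not shown in the log, so treat this as an illustration):

#!/usr/bin/env bash
# Reconstruction of the odd per-node split traced at hugepages.sh@80-@83.
nodes_test=()
size_kb=2098176          # size passed to get_test_nr_hugepages
default_hugepages=2048   # one 2 MiB hugepage, in kB
_nr_hugepages=$(( (size_kb + default_hugepages - 1) / default_hugepages ))   # 1025
_no_nodes=2

while (( _no_nodes > 0 )); do
	# Integer division gives the current (highest-numbered) node its share;
	# the remainder stays in the pool and ends up on node 0.
	nodes_test[_no_nodes - 1]=$(( _nr_hugepages / _no_nodes ))
	: $(( _nr_hugepages -= nodes_test[_no_nodes - 1] ))   # pages left: 513, then 0
	: $(( --_no_nodes ))                                  # nodes left: 1, then 0
done

declare -p nodes_test   # declare -a nodes_test=([0]="513" [1]="512")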
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # HUGEMEM=2049
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@150 -- # setup output
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:01.574 12:35:31 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:04.118 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:04.118 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:04.118 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@151 -- # verify_nr_hugepages
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@88 -- # local node
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72817676 kB' 'MemAvailable: 77730960 kB' 'Buffers: 10116 kB' 'Cached: 14760212 kB' 'SwapCached: 0 kB' 'Active: 11397076 kB' 'Inactive: 4039056 kB' 'Active(anon): 10204092 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 669128 kB' 'Mapped: 173200 kB' 'Shmem: 9538288 kB' 'KReclaimable: 517180 kB' 'Slab: 1046436 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529256 kB' 'KernelStack: 15904 kB' 'PageTables: 8228 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53481756 kB' 'Committed_AS: 11527340 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 199972 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:04.118 12:35:33 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
[... the @31 read / @32 compare / @32 continue cycle repeats for every meminfo field down to HardwareCorrupted ...]
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # anon=0
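Before counting hugepages, verify_nr_hugepages checks whether transparent hugepages could distort the numbers: the @95 test above inspects /sys/kernel/mm/transparent_hugepage/enabled (here 'always [madvise] never', so THP is madvise-only rather than disabled) and only then reads AnonHugePages. A small sketch of that step, reusing the get_meminfo sketch from earlier (illustrative, not the verbatim SPDK source):

#!/usr/bin/env bash
# Sketch of the verify_nr_hugepages preamble traced at hugepages.sh@95-@96.
anon=0
thp=$(< /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"

# The bracketed word is the active policy; skip the read only when it is [never].
if [[ $thp != *"[never]"* ]]; then
	# THP may back anonymous memory, so record how much (kB); 0 in this run.
	anon=$(get_meminfo AnonHugePages)
fi
echo "anon=$anon"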
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72818684 kB' 'MemAvailable: 77731968 kB' 'Buffers: 10116 kB' 'Cached: 14760216 kB' 'SwapCached: 0 kB' 'Active: 11397352 kB' 'Inactive: 4039056 kB' 'Active(anon): 10204368 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 669436 kB' 'Mapped: 173160 kB' 'Shmem: 9538292 kB' 'KReclaimable: 517180 kB' 'Slab: 1046456 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529276 kB' 'KernelStack: 15952 kB' 'PageTables: 8380 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53481756 kB' 'Committed_AS: 11527356 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 199940 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:04.120 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
[... the @31 read / @32 compare / @32 continue cycle repeats field by field through the meminfo keys ...]
00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue
KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 
12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.122 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:04.122 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- 
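The HugePages_Surp lookup above (and the HugePages_Rsvd lookup that follows) are instances of the same get_meminfo pattern the trace keeps replaying: pick /proc/meminfo or a per-node meminfo file, strip any "Node <n>" prefix, then scan key/value pairs until the requested key matches and echo its value. A minimal sketch of that pattern, reconstructed from the trace rather than copied from setup/common.sh (the sed-based prefix strip and the fallback echo 0 are assumptions):

    #!/usr/bin/env bash
    # Sketch of the lookup visible in the xtrace; not the actual setup/common.sh.
    get_meminfo() {
        local get=$1 node=${2:-} var val _
        local mem_f=/proc/meminfo
        # With a node argument, prefer the per-node stats file (common.sh@23-24).
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        # Per-node lines carry a "Node <n> " prefix; drop it, like the
        # mem=("${mem[@]#Node +([0-9]) }") expansion in the trace does.
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "$val"; return 0; }   # common.sh@33
        done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
        echo 0
    }
    get_meminfo HugePages_Surp      # global lookup -> 0 in this run
    get_meminfo HugePages_Total 0   # node 0 lookup -> 513 in this run

The escaped patterns in the trace ([[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] and so on) are just how xtrace prints the quoted right-hand side of that per-field comparison.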
00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:04.121 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:04.122 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:04.122 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:04.122 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:04.122 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:04.122 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:04.122 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72818684 kB' 'MemAvailable: 77731968 kB' 'Buffers: 10116 kB' 'Cached: 14760232 kB' 'SwapCached: 0 kB' 'Active: 11397116 kB' 'Inactive: 4039056 kB' 'Active(anon): 10204132 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 669180 kB' 'Mapped: 173160 kB' 'Shmem: 9538308 kB' 'KReclaimable: 517180 kB' 'Slab: 1046456 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529276 kB' 'KernelStack: 15936 kB' 'PageTables: 8332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53481756 kB' 'Committed_AS: 11527376 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 199940 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
[xtrace condensed: setup/common.sh@31-32 repeats IFS=': '; read -r var val _; [[ $var == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] for every field of the snapshot above, taking continue on each mismatch]
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # resv=0
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1025
00:05:04.123 nr_hugepages=1025
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0
00:05:04.123 resv_hugepages=0
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0
00:05:04.123 surplus_hugepages=0
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0
00:05:04.123 anon_hugepages=0
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@106 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@108 -- # (( 1025 == nr_hugepages ))
00:05:04.123 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:04.124 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72818684 kB' 'MemAvailable: 77731968 kB' 'Buffers: 10116 kB' 'Cached: 14760232 kB' 'SwapCached: 0 kB' 'Active: 11397652 kB' 'Inactive: 4039056 kB' 'Active(anon): 10204668 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 669716 kB' 'Mapped: 173160 kB' 'Shmem: 9538308 kB' 'KReclaimable: 517180 kB' 'Slab: 1046456 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529276 kB' 'KernelStack: 15952 kB' 'PageTables: 8384 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53481756 kB' 'Committed_AS: 11527400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 199940 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
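At this point the script has all the numbers it needs; hugepages.sh@106-109 then checks that the kernel's HugePages_Total matches the odd page count the test requested plus surplus and reserved pages. A sketch of that accounting, reusing the get_meminfo sketch above (variable names follow the trace; the ordering and the mismatch message are simplifications, not the script's exact source):

    # Accounting check sketched from hugepages.sh@96-109 in this trace.
    nr_hugepages=1025                      # odd page count requested by odd_alloc
    anon=$(get_meminfo AnonHugePages)      # 0 kB here
    surp=$(get_meminfo HugePages_Surp)     # 0
    resv=$(get_meminfo HugePages_Rsvd)     # 0
    total=$(get_meminfo HugePages_Total)   # 1025, matched below
    if (( total == nr_hugepages + surp + resv )); then
        echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp anon_hugepages=$anon"
    else
        echo "hugepage accounting mismatch: kernel reports $total" >&2
    fi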
[xtrace condensed: setup/common.sh@31-32 repeats IFS=': '; read -r var val _; [[ $var == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] for every field of the snapshot above, taking continue on each mismatch]
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages + surp + resv ))
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@26 -- # local node
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=513
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:04.125 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064936 kB' 'MemFree: 34173140 kB' 'MemUsed: 13891796 kB' 'SwapCached: 0 kB' 'Active: 8353816 kB' 'Inactive: 3549192 kB' 'Active(anon): 7502704 kB' 'Inactive(anon): 0 kB' 'Active(file): 851112 kB' 'Inactive(file): 3549192 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11555728 kB' 'Mapped: 142736 kB' 'AnonPages: 350448 kB' 'Shmem: 7155424 kB' 'KernelStack: 7592 kB' 'PageTables: 4648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 196428 kB' 'Slab: 480960 kB' 'SReclaimable: 196428 kB' 'SUnreclaim: 284532 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.126 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 
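The get_meminfo scan traced above and below is worth unpacking once, since xtrace expands it into hundreds of near-identical records. The helper takes a field name and an optional NUMA node; when a node is given and /sys/devices/system/node/nodeN/meminfo exists, it reads that file instead of /proc/meminfo, strips the "Node N " prefix from every record, then walks the records with IFS=': ' until the field name matches and echoes the value. A minimal re-sketch in bash, written from the behavior visible in this trace rather than the exact setup/common.sh source:

#!/usr/bin/env bash
# get_meminfo FIELD [NODE] - hedged reconstruction of the helper traced here.
shopt -s extglob

get_meminfo() {
    local get=$1 node=$2 mem_f=/proc/meminfo
    # A per-node query reads that node's own meminfo when present, matching
    # the [[ -e /sys/devices/system/node/nodeN/meminfo ]] test in the trace.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    # Node files prefix every record with "Node N "; strip it so the records
    # match the plain "Field: value kB" layout of /proc/meminfo.
    mem=("${mem[@]#Node +([0-9]) }")
    local line var val _
    for line in "${mem[@]}"; do
        # The long continue-loop in the trace is this scan: one comparison
        # per meminfo field until the requested one is found.
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

get_meminfo HugePages_Surp 1    # prints 0 for node1 in the run traced here

Each "continue" record that follows is xtrace echoing one such comparison; the loop only stops when it reaches HugePages_Surp and returns its value.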
00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44220568 kB' 'MemFree: 38646332 kB' 'MemUsed: 5574236 kB' 'SwapCached: 0 kB' 'Active: 3043976 kB' 'Inactive: 489864 kB' 'Active(anon): 2702104 kB' 'Inactive(anon): 0 kB' 'Active(file): 341872 kB' 'Inactive(file): 489864 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3214620 kB' 'Mapped: 30424 kB' 'AnonPages: 319444 kB' 'Shmem: 2382884 kB' 'KernelStack: 8344 kB' 'PageTables: 3688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 320752 kB' 'Slab: 565496 kB' 'SReclaimable: 320752 kB' 'SUnreclaim: 244744 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.127 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
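The point of this second per-node pass is the arithmetic that odd_alloc verifies at the end of the test. The run requested 1025 hugepages on a 2-node machine, so an even split is impossible; the harness expects the odd remainder on node 0, giving 513 pages on node0 and 512 on node1, plus each node's HugePages_Surp, which is 0 in this run. A minimal sketch of that expected split, assuming only the two-node layout shown in this trace (the real get_test_nr_hugepages_per_node in setup/hugepages.sh may distribute differently in the general case):

# odd_alloc expectation: nr_hugepages=1025 split across no_nodes=2.
nr_hugepages=1025
no_nodes=2
declare -a nodes_test
for ((node = 0; node < no_nodes; node++)); do
    nodes_test[node]=$((nr_hugepages / no_nodes))   # even share: 512 each
done
((nodes_test[0] += nr_hugepages % no_nodes))        # odd remainder to node 0

echo "node0=${nodes_test[0]} expecting 513"
echo "node1=${nodes_test[1]} expecting 512"

These are exactly the two "expecting" lines echoed a little further down, just before the final 512/513 comparison that closes the test.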
00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node0=513 expecting 513' 00:05:04.128 node0=513 expecting 513 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # echo 'node1=512 expecting 512' 00:05:04.128 node1=512 expecting 512 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@129 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:04.128 00:05:04.128 real 0m2.621s 00:05:04.128 user 0m0.888s 00:05:04.128 sys 0m1.612s 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:04.128 12:35:34 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:04.128 ************************************ 00:05:04.128 END TEST odd_alloc 00:05:04.128 ************************************ 00:05:04.128 12:35:34 setup.sh.hugepages -- setup/hugepages.sh@203 -- # run_test custom_alloc custom_alloc 00:05:04.128 
12:35:34 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:04.128 12:35:34 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:04.128 12:35:34 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:04.128 ************************************ 00:05:04.128 START TEST custom_alloc 00:05:04.128 ************************************ 00:05:04.128 12:35:34 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1129 -- # custom_alloc 00:05:04.128 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@157 -- # local IFS=, 00:05:04.128 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@159 -- # local node 00:05:04.128 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # nodes_hp=() 00:05:04.128 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@160 -- # local nodes_hp 00:05:04.128 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@162 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:04.128 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@164 -- # get_test_nr_hugepages 1048576 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=1048576 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=512 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=512 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 0 > 0 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 256 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 1 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # nodes_test[_no_nodes - 1]=256 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # : 0 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@80 -- # (( _no_nodes > 0 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@165 -- # nodes_hp[0]=512 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@166 -- # (( 2 > 
1 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # get_test_nr_hugepages 2097152 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@48 -- # local size=2097152 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # (( 1 > 1 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 1 > 0 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@168 -- # nodes_hp[1]=1024 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@171 -- # for node in "${!nodes_hp[@]}" 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@173 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # get_test_nr_hugepages_per_node 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # user_nodes=() 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@61 -- # local user_nodes 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # nodes_test=() 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@68 -- # (( 0 > 0 )) 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@73 -- # (( 2 > 0 )) 00:05:04.129 
12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=512 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # nodes_test[_no_nodes]=1024 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@77 -- # return 0 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # setup output 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:04.129 12:35:34 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:07.430 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:07.430 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:07.430 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nr_hugepages=1536 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # verify_nr_hugepages 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@88 -- # local node 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@19 -- # local var val 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 71768420 kB' 'MemAvailable: 76681704 kB' 'Buffers: 10116 kB' 'Cached: 14760360 kB' 'SwapCached: 0 kB' 'Active: 11399148 kB' 'Inactive: 4039056 kB' 'Active(anon): 10206164 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670936 kB' 'Mapped: 173260 kB' 'Shmem: 9538436 kB' 'KReclaimable: 517180 kB' 'Slab: 1046648 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529468 kB' 'KernelStack: 16160 kB' 'PageTables: 8880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52958492 kB' 'Committed_AS: 11530644 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200148 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == 
[trace condensed: get_meminfo AnonHugePages steps through the remaining /proc/meminfo keys (Cached through HardwareCorrupted) one read at a time; every non-matching key hits 'continue' at setup/common.sh@32]
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # anon=0
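[editor's note: the records above are one full pass of the meminfo lookup helper: it reads /proc/meminfo with IFS=': ', skips every key that is not the one requested, and echoes the value of the first match. A minimal, self-contained re-creation of that pattern is sketched below; the function name get_meminfo_value is illustrative, reconstructed from the trace rather than copied from setup/common.sh.]

#!/usr/bin/env bash
# Sketch of the lookup pattern this trace is exercising: split each
# /proc/meminfo line on ': ' into a key and a value, and print the value
# of the first key that equals the requested one.
get_meminfo_value() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # every non-matching key is skipped -- this is what produces the
        # long runs of 'continue' records in the log
        [[ $var == "$get" ]] || continue
        echo "$val"   # the third field soaked up the trailing 'kB', if any
        return 0
    done < /proc/meminfo
    return 1
}

get_meminfo_value AnonHugePages

[editor's note continued: runnable on any host with /proc/meminfo; on this build host it prints 0, matching the anon=0 record above.]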
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:07.432 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 71768056 kB' 'MemAvailable: 76681340 kB' 'Buffers: 10116 kB' 'Cached: 14760360 kB' 'SwapCached: 0 kB' 'Active: 11398912 kB' 'Inactive: 4039056 kB' 'Active(anon): 10205928 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670688 kB' 'Mapped: 173180 kB' 'Shmem: 9538436 kB' 'KReclaimable: 517180 kB' 'Slab: 1046644 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529464 kB' 'KernelStack: 16064 kB' 'PageTables: 8976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52958492 kB' 'Committed_AS: 11527872 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200068 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
[trace condensed: the same per-key scan repeats for HugePages_Surp; every key from MemTotal through HugePages_Rsvd hits 'continue' at setup/common.sh@32 until the target matches]
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@98 -- # surp=0
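[editor's note: the backslash runs in these match records, e.g. \H\u\g\e\P\a\g\e\s\_\S\u\r\p, are not corruption. Under set -x, bash prints a quoted right-hand side of == inside [[ ]] with every character backslash-escaped, to show the operand is compared as a literal string rather than a glob pattern. A two-line demonstration, safe to run anywhere:]

#!/usr/bin/env bash
set -x
get=AnonHugePages
# xtrace renders the quoted "$get" below exactly as it appears in this log:
# [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
[[ Cached == "$get" ]] || true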
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:07.434 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 71768136 kB' 'MemAvailable: 76681420 kB' 'Buffers: 10116 kB' 'Cached: 14760396 kB' 'SwapCached: 0 kB' 'Active: 11398676 kB' 'Inactive: 4039056 kB' 'Active(anon): 10205692 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670484 kB' 'Mapped: 173168 kB' 'Shmem: 9538472 kB' 'KReclaimable: 517180 kB' 'Slab: 1046612 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529432 kB' 'KernelStack: 16016 kB' 'PageTables: 8584 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52958492 kB' 'Committed_AS: 11528096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 199988 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
[trace condensed: the same per-key scan repeats for HugePages_Rsvd; every key from MemTotal through HugePages_Free hits 'continue' at setup/common.sh@32 until the target matches]
00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0
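[editor's note: each invocation above logs 'local node=' plus a test for /sys/devices/system/node/node/meminfo. With node empty that /sys path cannot exist, so the helper falls back to the system-wide /proc/meminfo; with a node number it would read the per-node file, whose lines carry a 'Node N ' prefix, which is what the Node +([0-9]) strip in the mem=() record is for. A hedged sketch of that selection logic follows; pick_meminfo and its argument handling are illustrative assumptions, not SPDK's code:]

#!/usr/bin/env bash
shopt -s extglob   # the +([0-9]) pattern below is an extended glob

pick_meminfo() {
    local node=$1 mem_f mem
    mem_f=/proc/meminfo
    # with an empty $node the path below is .../node/node/meminfo, which
    # never exists, so the system-wide file wins -- exactly what this trace shows
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    mapfile -t mem < "$mem_f"
    # per-node files prefix every line with 'Node N '; strip it so the
    # key/value layout matches /proc/meminfo
    mem=("${mem[@]#Node +([0-9]) }")
    printf '%s\n' "${mem[@]}"
}

pick_meminfo ""   # system-wide view, as in this run
pick_meminfo 0    # NUMA node 0, when /sys exposes it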
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1536 00:05:07.436 nr_hugepages=1536 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:07.436 resv_hugepages=0 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:07.436 surplus_hugepages=0 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:07.436 anon_hugepages=0 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@106 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@108 -- # (( 1536 == nr_hugepages )) 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local 
var val 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 71768744 kB' 'MemAvailable: 76682028 kB' 'Buffers: 10116 kB' 'Cached: 14760404 kB' 'SwapCached: 0 kB' 'Active: 11398332 kB' 'Inactive: 4039056 kB' 'Active(anon): 10205348 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670136 kB' 'Mapped: 173168 kB' 'Shmem: 9538480 kB' 'KReclaimable: 517180 kB' 'Slab: 1046612 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529432 kB' 'KernelStack: 16016 kB' 'PageTables: 8580 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52958492 kB' 'Committed_AS: 11528116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200004 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
[xtrace: get_meminfo walks the snapshot above key by key, hitting "continue" on every non-match, until HugePages_Total matches]
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@26 -- # local node
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=512
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
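get_nodes has just recorded the per-node assignment (nodes_sys: 512 pages on node0, 1024 on node1), and the checks at hugepages.sh@106/@109 assert that the per-node counts plus surplus and reserved pages account for the global total, 512 + 1024 = 1536 here. The same cross-check can be run directly against the kernel files; a sketch, assuming the standard procfs/sysfs layout:

    total=0
    for node_dir in /sys/devices/system/node/node[0-9]*; do
        # Per-node meminfo lines look like "Node 1 HugePages_Total:  1024"
        pages=$(awk '/HugePages_Total/ {print $NF}' "$node_dir/meminfo")
        echo "${node_dir##*/}: $pages hugepages"
        (( total += pages ))
    done
    global=$(awk '/^HugePages_Total/ {print $2}' /proc/meminfo)
    (( total == global )) && echo "OK: $total == $global"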
00:05:07.438 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064936 kB' 'MemFree: 34166636 kB' 'MemUsed: 13898300 kB' 'SwapCached: 0 kB' 'Active: 8354224 kB' 'Inactive: 3549192 kB' 'Active(anon): 7503112 kB' 'Inactive(anon): 0 kB' 'Active(file): 851112 kB' 'Inactive(file): 3549192 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11555896 kB' 'Mapped: 142744 kB' 'AnonPages: 350676 kB' 'Shmem: 7155592 kB' 'KernelStack: 7688 kB' 'PageTables: 4944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 196428 kB' 'Slab: 481000 kB' 'SReclaimable: 196428 kB' 'SUnreclaim: 284572 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[xtrace: get_meminfo steps through each node0 meminfo key with "continue" until HugePages_Surp matches]
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 1
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:07.699 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44220568 kB' 'MemFree: 37602808 kB' 'MemUsed: 6617760 kB' 'SwapCached: 0 kB' 'Active: 3044140 kB' 'Inactive: 489864 kB' 'Active(anon): 2702268 kB' 'Inactive(anon): 0 kB' 'Active(file): 341872 kB' 'Inactive(file): 489864 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3214648 kB' 'Mapped: 30424 kB' 'AnonPages: 319464 kB' 'Shmem: 2382912 kB' 'KernelStack: 8328 kB' 'PageTables: 3636 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 320752 kB' 'Slab: 565612 kB' 'SReclaimable: 320752 kB' 'SUnreclaim: 244860 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
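Both per-node lookups above hinge on two bash expansions that are easy to misread in the trace: ${node##*node} peels the node index off a sysfs path, and ${mem[@]#Node +([0-9]) } strips the "Node <N> " prefix that per-node meminfo lines carry (the +([0-9]) part is an extglob pattern). A standalone demonstration, with a sample line borrowed from the node1 snapshot above:

    shopt -s extglob  # +([0-9]) needs extglob, which is off by default in scripts

    node=/sys/devices/system/node/node1
    echo "${node##*node}"              # -> 1 (longest prefix ending in 'node' removed)

    line='Node 1 HugePages_Total:  1024'
    echo "${line#Node +([0-9]) }"      # -> HugePages_Total:  1024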
[xtrace: get_meminfo steps through each node1 meminfo key with "continue" until HugePages_Surp matches]
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node0=512 expecting 512'
node0=512 expecting 512
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # echo 'node1=1024 expecting 1024'
node1=1024 expecting 1024
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@129 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
00:05:07.700 real 0m3.358s
00:05:07.700 user 0m1.298s
00:05:07.700 sys 0m2.149s
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:07.700 12:35:37 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x
00:05:07.700 ************************************
00:05:07.700 END TEST custom_alloc
00:05:07.700 ************************************
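The verification custom_alloc just passed collects the per-node page counts as associative-array keys (sorted_t for what the test requested, sorted_s for what the kernel reports) and compares the two sets, which makes the check independent of node order; the final [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] is the joined-key comparison. A sketch of the same idiom with the values from this run (the sort/paste join is illustrative, not the harness's exact code):

    declare -A sorted_t=() sorted_s=()
    declare -a nodes_test=([0]=512 [1]=1024)   # requested layout
    declare -a nodes_sys=([0]=512 [1]=1024)    # layout the kernel reports

    for node in "${!nodes_test[@]}"; do
        sorted_t[${nodes_test[node]}]=1
        sorted_s[${nodes_sys[node]}]=1
        echo "node$node=${nodes_sys[node]} expecting ${nodes_test[node]}"
    done

    # Associative-array keys come back unordered, so sort before joining
    t=$(printf '%s\n' "${!sorted_t[@]}" | sort -n | paste -sd,)
    s=$(printf '%s\n' "${!sorted_s[@]}" | sort -n | paste -sd,)
    [[ $t == "$s" ]] && echo "layout verified: $t"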
00:05:07.700 12:35:37 setup.sh.hugepages -- setup/hugepages.sh@204 -- # run_test no_shrink_alloc no_shrink_alloc
00:05:07.700 12:35:37 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:07.700 12:35:37 setup.sh.hugepages -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:07.700 12:35:37 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:05:07.700 ************************************
00:05:07.700 START TEST no_shrink_alloc
00:05:07.700 ************************************
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1129 -- # no_shrink_alloc
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@185 -- # get_test_nr_hugepages 2097152 0
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@48 -- # local size=2097152
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # (( 2 > 1 ))
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # shift
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # node_ids=('0')
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # local node_ids
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@54 -- # (( size >= default_hugepages ))
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@56 -- # nr_hugepages=1024
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # get_test_nr_hugepages_per_node 0
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # user_nodes=('0')
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@61 -- # local user_nodes
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@63 -- # local _nr_hugepages=1024
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _no_nodes=2
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # nodes_test=()
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@66 -- # local -g nodes_test
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@68 -- # (( 1 > 0 ))
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # for _no_nodes in "${user_nodes[@]}"
00:05:07.700 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # nodes_test[_no_nodes]=1024
00:05:07.701 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@72 -- # return 0
00:05:07.701 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # NRHUGE=1024
00:05:07.701 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # HUGENODE=0
00:05:07.701 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@188 -- # setup output
00:05:07.701 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:07.701 12:35:37 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
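get_test_nr_hugepages above turns the 2097152 kB request into nr_hugepages=1024, which is the requested size divided by this machine's 2048 kB hugepage size, and HUGENODE=0 pins the whole reservation to node 0 before scripts/setup.sh runs. The arithmetic, as a sketch that reads the page size the way a caller could reproduce it:

    size_kb=2097152
    hugepagesize_kb=$(awk '/^Hugepagesize/ {print $2}' /proc/meminfo)  # 2048 on this box
    echo "NRHUGE=$(( size_kb / hugepagesize_kb )) HUGENODE=0"          # -> NRHUGE=1024 HUGENODE=0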
driver 00:05:10.232 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:10.232 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:10.232 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:10.232 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:10.232 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:10.232 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:10.232 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:10.232 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@189 -- # verify_nr_hugepages 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
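The escaped pattern just above ([[ always [madvise] never != *\[\n\e\v\e\r\]* ]]) checks that transparent hugepages are not fully disabled: the kernel brackets the active mode, and the anonymous-hugepage count is only worth collecting when that mode is not [never]. A minimal standalone equivalent of the check (a sketch; the sysfs path is the standard one on recent kernels, and the variable name is illustrative, not the script's):

    # Sketch: the active THP mode is the bracketed word, e.g. "always [madvise] never".
    thp=$(</sys/kernel/mm/transparent_hugepage/enabled)
    if [[ $thp != *"[never]"* ]]; then
        # THP is enabled in some form, so AnonHugePages may be non-zero.
        grep AnonHugePages /proc/meminfo
    fi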
00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.232 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72828008 kB' 'MemAvailable: 77741292 kB' 'Buffers: 10116 kB' 'Cached: 14760504 kB' 'SwapCached: 0 kB' 'Active: 11399880 kB' 'Inactive: 4039056 kB' 'Active(anon): 10206896 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 671696 kB' 'Mapped: 173192 kB' 'Shmem: 9538580 kB' 'KReclaimable: 517180 kB' 'Slab: 1046700 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529520 kB' 'KernelStack: 15920 kB' 'PageTables: 8280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11528096 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200068 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' [setup/common.sh@31-32 per-key scan elided: every /proc/meminfo key before AnonHugePages is read, compared, and skipped with continue until it matches]
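After printing the snapshot, the trace walks every key through the same four steps (IFS=': ', read -r var val _, a comparison against the requested key, continue) until it hits AnonHugePages. A minimal reconstruction of that loop, assuming the field splitting shown in the trace (the real setup/common.sh may differ in detail):

    # Sketch: print the numeric value for a single /proc/meminfo key.
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] || continue  # skip every non-matching key
            echo "$val"                       # first field after the colon
            return 0
        done </proc/meminfo
        return 1
    }

On this box, get_meminfo_sketch AnonHugePages prints 0, which is exactly the echo 0 / anon=0 pair the trace reaches below.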
00:05:10.234 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:10.234 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.234 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:10.234 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0
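anon=0 closes the first lookup. The count being verified here came from the get_test_nr_hugepages call at the top of this test: 2097152 kB requested on node 0, divided by the 2048 kB default page size, gives the nr_hugepages=1024 seen in the trace. The arithmetic as a sketch (variable names illustrative, not the script's):

    # Sketch: derive the hugepage count the test allocates.
    size_kb=2097152             # size argument passed to get_test_nr_hugepages
    hugepage_kb=2048            # Hugepagesize reported in the snapshots above
    nr_hugepages=$((size_kb / hugepage_kb))
    echo "NRHUGE=$nr_hugepages HUGENODE=0"   # NRHUGE=1024 HUGENODE=0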
00:05:10.234 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:10.234 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp [setup/common.sh preamble as in the AnonHugePages lookup above] 00:05:10.234 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72829272 kB' 'MemAvailable: 77742556 kB' 'Buffers: 10116 kB' 'Cached: 14760508 kB' 'SwapCached: 0 kB' 'Active: 11399944 kB' 'Inactive: 4039056 kB' 'Active(anon): 10206960 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 671836 kB' 'Mapped: 173180 kB' 'Shmem: 9538584 kB' 'KReclaimable: 517180 kB' 'Slab: 1046644 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529464 kB' 'KernelStack: 15904 kB' 'PageTables: 8216 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11528116 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200052 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' [setup/common.sh@31-32 per-key scan elided: every key before HugePages_Surp is compared and skipped with continue]
00:05:10.500 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:10.500 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.500 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:10.500 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0
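With anon=0 and surp=0 recorded, only the reserved count remains, and the snapshots above already show the steady state the verifier should find: HugePages_Total: 1024, HugePages_Free: 1024, HugePages_Rsvd: 0, HugePages_Surp: 0. A hand-rolled version of that bookkeeping (a sketch of plausible checks against these numbers, not the script's exact assertions):

    # Sketch: confirm the pool matches what NRHUGE=1024 should have produced.
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    free=$(awk '/^HugePages_Free:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    ((total - surp == 1024)) || echo "unexpected hugepage total: $total" >&2
    ((free <= total)) || echo "free pages exceed total: $free > $total" >&2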
00:05:10.500 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:10.500 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd [setup/common.sh preamble as in the lookups above] 00:05:10.500 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72827788 kB' 'MemAvailable: 77741072 kB' 'Buffers: 10116 kB' 'Cached: 14760536 kB' 'SwapCached: 0 kB' 'Active: 11400052 kB' 'Inactive: 4039056 kB' 'Active(anon): 10207068 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 671968 kB' 'Mapped: 173180 kB' 'Shmem: 9538612 kB' 'KReclaimable: 517180 kB' 'Slab: 1046644 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529464 kB' 'KernelStack: 15920 kB' 'PageTables: 8280 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11528508 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200068 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' [setup/common.sh@31-32 per-key scan elided; the trace breaks off mid-scan before the HugePages_Rsvd match]
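Each lookup above re-reads and re-prints the whole of /proc/meminfo before scanning for one key; outside the harness the same answer comes from a single filter. For example (equivalent in effect, not the harness's code):

    awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo   # prints 0 on this box, per the snapshot above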
var val _ 00:05:10.501 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.501 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.501 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.501 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.501 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.501 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:10.502 nr_hugepages=1024 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:10.502 resv_hugepages=0 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:10.502 surplus_hugepages=0 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:10.502 anon_hugepages=0 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local 
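For readers following the trace: the setup/common.sh@17-33 steps above are get_meminfo walking a meminfo file line by line, splitting each line on IFS=': ' and echoing the value of the first key that matches the requested field. Below is a minimal sketch of that lookup pattern in bash; it is illustrative rather than the verbatim SPDK helper, and the name get_meminfo_sketch is invented here.

#!/usr/bin/env bash
shopt -s extglob   # for the +([0-9]) pattern that strips "Node N " prefixes

# Minimal sketch of the traced lookup (illustrative, not verbatim
# setup/common.sh): choose /proc/meminfo or a per-node sysfs file,
# then split each line on ': ' and print the value of the first
# matching key.
get_meminfo_sketch() {
    local get=$1 node=${2:-} mem_f=/proc/meminfo line var val _
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    while IFS= read -r line; do
        line=${line#Node +([0-9]) }        # per-node files prefix every key with "Node N "
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"                    # bare count (e.g. 1024) or kB figure
            return 0
        fi
    done < "$mem_f"
    return 1
}

# Calls mirroring this log; both print 0 on this machine.
get_meminfo_sketch HugePages_Rsvd
get_meminfo_sketch HugePages_Surp 0

With resv=0 and surp=0, the hugepages.sh@106 check above reduces to 1024 == 1024 + 0 + 0, which is why the verification passes.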
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72827284 kB' 'MemAvailable: 77740568 kB' 'Buffers: 10116 kB' 'Cached: 14760556 kB' 'SwapCached: 0 kB' 'Active: 11400204 kB' 'Inactive: 4039056 kB' 'Active(anon): 10207220 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 672024 kB' 'Mapped: 173180 kB' 'Shmem: 9538632 kB' 'KReclaimable: 517180 kB' 'Slab: 1046644 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 529464 kB' 'KernelStack: 15952 kB' 'PageTables: 8388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11528528 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200084 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
00:05:10.502 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [trace condensed: every key from MemTotal through Unaccepted is read and skipped via continue, none matching HugePages_Total]
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv ))
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9])
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 ))
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}"
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv ))
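The get_nodes trace above loops over /sys/devices/system/node/node+([0-9]) and fills nodes_sys with each node's hugepage count: 1024 on node0, 0 on node1, no_nodes=2. Below is a sketch of that enumeration; the per-node nr_hugepages path is the standard sysfs layout for 2048 kB pages and is an assumption here, since the trace only shows the resulting assignments.

#!/usr/bin/env bash
shopt -s extglob nullglob

# Sketch of the get_nodes step traced above: enumerate NUMA nodes and
# record each node's 2 MB hugepage count. The sysfs nr_hugepages path
# is assumed, not taken from the trace.
declare -A nodes_sys
for node in /sys/devices/system/node/node+([0-9]); do
    nodes_sys[${node##*node}]=$(< "$node/hugepages/hugepages-2048kB/nr_hugepages")
done
no_nodes=${#nodes_sys[@]}
(( no_nodes > 0 )) || { echo "no NUMA nodes found" >&2; exit 1; }
# On this machine the loop yields nodes_sys[0]=1024 and nodes_sys[1]=0.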
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064936 kB' 'MemFree: 33114704 kB' 'MemUsed: 14950232 kB' 'SwapCached: 0 kB' 'Active: 8354340 kB' 'Inactive: 3549192 kB' 'Active(anon): 7503228 kB' 'Inactive(anon): 0 kB' 'Active(file): 851112 kB' 'Inactive(file): 3549192 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11556020 kB' 'Mapped: 142756 kB' 'AnonPages: 350824 kB' 'Shmem: 7155716 kB' 'KernelStack: 7624 kB' 'PageTables: 4740 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 196428 kB' 'Slab: 481188 kB' 'SReclaimable: 196428 kB' 'SUnreclaim: 284760 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:10.504 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [trace condensed: every node0 key from MemTotal through HugePages_Free is read and skipped via continue, none matching HugePages_Surp]
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 ))
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}"
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1
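The two hugepages.sh@126 assignments above index associative arrays by the counts themselves: sorted_t collects the expected per-node totals and sorted_s the totals reported by sysfs, so the two distributions can be compared as multisets regardless of which node holds which count. A sketch with this run's values follows; nodes_test (the expected split) is an assumption, since the log only shows it being iterated.

#!/usr/bin/env bash
# Sketch of the per-node comparison traced above, using this run's
# values. nodes_test is assumed to carry the expected split.
declare -A sorted_t sorted_s
declare -A nodes_test=([0]=1024) nodes_sys=([0]=1024 [1]=0)

for node in "${!nodes_test[@]}"; do
    sorted_t[${nodes_test[$node]}]=1   # expected counts as keys
    sorted_s[${nodes_sys[$node]}]=1    # actual sysfs counts as keys
    echo "node$node=${nodes_sys[$node]} expecting ${nodes_test[$node]}"
done
# Prints "node0=1024 expecting 1024", matching the echo and the
# [[ 1024 == 1024 ]] check that follow in the log.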
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024'
00:05:10.506 node0=1024 expecting 1024
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]]
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # CLEAR_HUGE=no
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # NRHUGE=512
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # HUGENODE=0
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@192 -- # setup output
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:10.506 12:35:40 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:05:13.805 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver
00:05:13.805 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:05:13.805 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:05:13.805 INFO: Requested 512 hugepages but 1024 already allocated on node0
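The INFO line records why the NRHUGE=512 request changed nothing: with CLEAR_HUGE=no, scripts/setup.sh keeps the existing, larger allocation of 1024 pages on node0 instead of shrinking it, which is exactly the behavior this no_shrink_alloc test verifies. A hedged sketch of such a guard follows; the real logic in spdk/scripts/setup.sh handles more cases (multiple nodes, clearing, other page sizes) and this is not its actual implementation.

#!/usr/bin/env bash
# Hedged sketch of the no-shrink guard implied by the INFO line above;
# not the actual spdk/scripts/setup.sh code.
NRHUGE=${NRHUGE:-512}
HUGENODE=${HUGENODE:-0}
nr_file=/sys/devices/system/node/node$HUGENODE/hugepages/hugepages-2048kB/nr_hugepages

current=$(< "$nr_file")
if (( current >= NRHUGE )); then
    # CLEAR_HUGE=no: keep an allocation that already meets the request.
    echo "INFO: Requested $NRHUGE hugepages but $current already allocated on node$HUGENODE"
else
    echo "$NRHUGE" > "$nr_file"   # needs root; the kernel may grant fewer pages
fi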
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@194 -- # verify_nr_hugepages
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@88 -- # local node
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local sorted_t
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_s
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local surp
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local resv
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local anon
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@95 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # get_meminfo AnonHugePages
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72842912 kB' 'MemAvailable: 77756196 kB' 'Buffers: 10116 kB' 'Cached: 14760640 kB' 'SwapCached: 0 kB' 'Active: 11399628 kB' 'Inactive: 4039056 kB' 'Active(anon): 10206644 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670860 kB' 'Mapped: 173188 kB' 'Shmem: 9538716 kB' 'KReclaimable: 517180 kB' 'Slab: 1046052 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 528872 kB' 'KernelStack: 15888 kB' 'PageTables: 8516 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11528480 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200052 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB'
00:05:13.805 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31-32 -- # [trace condensed: every key from MemTotal through Writeback is read and skipped via continue, none matching AnonHugePages]
00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:13.806 12:35:43
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:13.806 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # anon=0 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # get_meminfo HugePages_Surp 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72846496 kB' 'MemAvailable: 77759780 kB' 'Buffers: 10116 kB' 'Cached: 14760644 kB' 'SwapCached: 0 kB' 'Active: 11399084 kB' 'Inactive: 4039056 kB' 'Active(anon): 10206100 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670764 kB' 'Mapped: 173196 kB' 'Shmem: 9538720 kB' 'KReclaimable: 517180 kB' 'Slab: 1046044 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 528864 kB' 'KernelStack: 15904 kB' 
'PageTables: 8212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11528632 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200036 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.807 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@98 -- # surp=0 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Rsvd 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:13.808 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72846888 kB' 'MemAvailable: 77760172 kB' 'Buffers: 10116 kB' 'Cached: 14760668 kB' 'SwapCached: 0 kB' 'Active: 11399256 kB' 'Inactive: 4039056 kB' 'Active(anon): 10206272 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 670884 kB' 'Mapped: 173196 kB' 'Shmem: 9538744 kB' 'KReclaimable: 517180 kB' 'Slab: 1046044 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 528864 kB' 'KernelStack: 15952 kB' 'PageTables: 8388 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11529024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200052 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.809 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.810 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:13.810 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # resv=0 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@101 -- # echo nr_hugepages=1024 00:05:13.811 nr_hugepages=1024 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo resv_hugepages=0 00:05:13.811 resv_hugepages=0 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo surplus_hugepages=0 00:05:13.811 surplus_hugepages=0 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo anon_hugepages=0 00:05:13.811 anon_hugepages=0 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@106 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@108 -- # (( 1024 == nr_hugepages )) 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # get_meminfo HugePages_Total 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92285504 kB' 'MemFree: 72847120 kB' 'MemAvailable: 77760404 kB' 'Buffers: 10116 kB' 'Cached: 14760688 kB' 'SwapCached: 0 kB' 'Active: 11399628 kB' 'Inactive: 4039056 kB' 'Active(anon): 10206644 kB' 'Inactive(anon): 0 kB' 'Active(file): 1192984 kB' 'Inactive(file): 4039056 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 671260 kB' 'Mapped: 173196 kB' 'Shmem: 9538764 kB' 'KReclaimable: 517180 kB' 'Slab: 1046044 kB' 'SReclaimable: 517180 kB' 'SUnreclaim: 528864 kB' 'KernelStack: 15968 kB' 'PageTables: 8424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53482780 kB' 'Committed_AS: 11529048 kB' 'VmallocTotal: 
34359738367 kB' 'VmallocUsed: 200052 kB' 'VmallocChunk: 0 kB' 'Percpu: 68832 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 419328 kB' 'DirectMap2M: 7645184 kB' 'DirectMap1G: 94371840 kB' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.811 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 
12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:13.812 
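The long stretch above is SPDK's get_meminfo helper (setup/common.sh) walking /proc/meminfo one "key: value" pair at a time with IFS=': ', continuing past every key until it reaches the requested one, then echoing its value: 0 for HugePages_Rsvd and 1024 for HugePages_Total. The snapshot it parses is internally consistent, since 1024 pages x Hugepagesize 2048 kB = 2097152 kB, matching the Hugetlb line. A condensed standalone sketch of the same scan (function name is mine, not SPDK's):

#!/usr/bin/env bash
# Condensed sketch of the scan traced above: split each meminfo line on
# ': ' and print the value of the requested key. An optional node argument
# switches to the per-node sysfs file, whose "Node N " prefix is stripped
# first (the real helper strips it with an extglob expansion over mapfile).
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    [[ -n $node ]] && mem_f=/sys/devices/system/node/node$node/meminfo
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
    return 1
}

get_meminfo_sketch HugePages_Total   # -> 1024 on the box traced here
get_meminfo_sketch HugePages_Rsvd    # -> 0

The process substitution keeps the while loop in the current shell, so the early return works the same way the helper's "echo 1024; return 0" does in the trace.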
12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@111 -- # get_nodes 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@26 -- # local node 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=1024 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@28 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # nodes_sys[${node##*node}]=0 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@31 -- # no_nodes=2 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # (( no_nodes > 0 )) 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@114 -- # for node in "${!nodes_test[@]}" 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # (( nodes_test[node] += resv )) 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # get_meminfo HugePages_Surp 0 00:05:13.812 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48064936 kB' 'MemFree: 33125516 kB' 'MemUsed: 14939420 kB' 'SwapCached: 0 kB' 'Active: 8354796 kB' 'Inactive: 3549192 kB' 'Active(anon): 7503684 kB' 'Inactive(anon): 0 kB' 'Active(file): 851112 kB' 'Inactive(file): 3549192 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 11556120 kB' 'Mapped: 142772 kB' 'AnonPages: 351124 kB' 'Shmem: 7155816 kB' 'KernelStack: 7640 kB' 'PageTables: 4792 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 196428 kB' 'Slab: 480876 kB' 'SReclaimable: 196428 kB' 'SUnreclaim: 284448 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
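At this point get_nodes enumerates /sys/devices/system/node/node* with an extglob pattern and records each node's hugepage count (node0 carries all 1024 pages, node1 none, no_nodes=2), after which the same get_meminfo scan is repeated against node0's own meminfo file. A sketch of that enumeration, reading the counts straight from sysfs (the real helper derives them elsewhere in hugepages.sh, but the resulting map is the same):

# Sketch of the get_nodes step traced here; indices mirror nodes_sys above.
shopt -s extglob
nodes_sys=()
for node in /sys/devices/system/node/node+([0-9]); do
    # hugepages-2048kB matches the 2 MiB Hugepagesize reported earlier
    nodes_sys[${node##*node}]=$(<"$node"/hugepages/hugepages-2048kB/nr_hugepages)
done
no_nodes=${#nodes_sys[@]}   # 2 on this machine
(( no_nodes > 0 )) || echo "no NUMA nodes found" >&2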
# continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.813 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += 0 )) 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@125 -- # for node in "${!nodes_test[@]}" 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_t[nodes_test[node]]=1 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # sorted_s[nodes_sys[node]]=1 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # echo 'node0=1024 expecting 1024' 00:05:13.814 node0=1024 expecting 1024 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@129 -- # [[ 1024 == \1\0\2\4 ]] 00:05:13.814 00:05:13.814 real 0m6.094s 00:05:13.814 user 0m2.284s 00:05:13.814 sys 0m3.926s 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:13.814 12:35:43 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:13.814 ************************************ 00:05:13.814 END TEST no_shrink_alloc 00:05:13.814 ************************************ 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@206 -- # clear_hp 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@36 -- # local node hp 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@38 -- # for node in "${!nodes_sys[@]}" 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@40 -- # echo 0 00:05:13.814 12:35:43 setup.sh.hugepages -- 
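With node0 holding its expected 1024 pages and no surplus, no_shrink_alloc passes ("node0=1024 expecting 1024") and the clear_hp teardown traced here zeroes every per-node pool before the CLEAR_HUGE export that follows just below. A sketch of that teardown (needs root; the four "echo 0" entries in the trace are writes like these, two nodes times two page sizes, presumably 2 MiB and 1 GiB):

# Sketch of clear_hp: zero each hugepage pool on each NUMA node.
shopt -s extglob nullglob
for node in /sys/devices/system/node/node+([0-9]); do
    for hp in "$node"/hugepages/hugepages-*; do
        echo 0 > "$hp/nr_hugepages"   # release this pool (requires root)
    done
done
export CLEAR_HUGE=yes   # consumed by scripts/setup.sh on later resets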
setup/hugepages.sh@44 -- # export CLEAR_HUGE=yes 00:05:13.814 12:35:43 setup.sh.hugepages -- setup/hugepages.sh@44 -- # CLEAR_HUGE=yes 00:05:13.814 00:05:13.814 real 0m22.722s 00:05:13.814 user 0m7.336s 00:05:13.814 sys 0m12.488s 00:05:13.814 12:35:43 setup.sh.hugepages -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:13.814 12:35:43 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:13.814 ************************************ 00:05:13.814 END TEST hugepages 00:05:13.814 ************************************ 00:05:13.814 12:35:43 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:13.814 12:35:43 setup.sh -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:13.814 12:35:43 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:13.814 12:35:43 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:13.814 ************************************ 00:05:13.814 START TEST driver 00:05:13.814 ************************************ 00:05:13.814 12:35:43 setup.sh.driver -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:05:14.073 * Looking for test storage... 00:05:14.073 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:14.073 12:35:43 setup.sh.driver -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:14.073 12:35:43 setup.sh.driver -- common/autotest_common.sh@1693 -- # lcov --version 00:05:14.073 12:35:43 setup.sh.driver -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:14.073 12:35:44 setup.sh.driver -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@336 -- # IFS=.-: 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@336 -- # read -ra ver1 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@337 -- # IFS=.-: 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@337 -- # read -ra ver2 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@338 -- # local 'op=<' 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@340 -- # ver1_l=2 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@341 -- # ver2_l=1 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@344 -- # case "$op" in 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@345 -- # : 1 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@365 -- # decimal 1 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@353 -- # local d=1 00:05:14.073 12:35:44 setup.sh.driver -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@355 -- # echo 1 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@365 -- # ver1[v]=1 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@366 -- # decimal 2 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@353 -- # local d=2 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@355 -- # echo 2 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@366 -- # ver2[v]=2 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:14.074 12:35:44 setup.sh.driver -- scripts/common.sh@368 -- # return 0 00:05:14.074 12:35:44 setup.sh.driver -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:14.074 12:35:44 setup.sh.driver -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:14.074 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.074 --rc genhtml_branch_coverage=1 00:05:14.074 --rc genhtml_function_coverage=1 00:05:14.074 --rc genhtml_legend=1 00:05:14.074 --rc geninfo_all_blocks=1 00:05:14.074 --rc geninfo_unexecuted_blocks=1 00:05:14.074 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.074 ' 00:05:14.074 12:35:44 setup.sh.driver -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:14.074 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.074 --rc genhtml_branch_coverage=1 00:05:14.074 --rc genhtml_function_coverage=1 00:05:14.074 --rc genhtml_legend=1 00:05:14.074 --rc geninfo_all_blocks=1 00:05:14.074 --rc geninfo_unexecuted_blocks=1 00:05:14.074 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.074 ' 00:05:14.074 12:35:44 setup.sh.driver -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:14.074 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.074 --rc genhtml_branch_coverage=1 00:05:14.074 --rc genhtml_function_coverage=1 00:05:14.074 --rc genhtml_legend=1 00:05:14.074 --rc geninfo_all_blocks=1 00:05:14.074 --rc geninfo_unexecuted_blocks=1 00:05:14.074 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.074 ' 00:05:14.074 12:35:44 setup.sh.driver -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:14.074 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:14.074 --rc genhtml_branch_coverage=1 00:05:14.074 --rc genhtml_function_coverage=1 00:05:14.074 --rc genhtml_legend=1 00:05:14.074 --rc geninfo_all_blocks=1 00:05:14.074 --rc geninfo_unexecuted_blocks=1 00:05:14.074 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:14.074 ' 00:05:14.074 12:35:44 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:14.074 12:35:44 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:14.074 12:35:44 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:18.266 12:35:48 setup.sh.driver -- 
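The preamble of the driver test probes the installed lcov and runs cmp_versions from scripts/common.sh, as traced above: each version string is split on '.', '-', and ':' into an array (IFS=.-:), every element is validated as a decimal, and the arrays are compared element by element, padding the shorter one with zeros. A standalone sketch of the less-than case exercised here (helper name is mine):

# Sketch of the "lt 1.15 2" comparison traced above.
lt_sketch() {                     # true when version $1 < version $2
    local IFS=.-:                 # split on dots, dashes, and colons
    local -a ver1 ver2
    read -ra ver1 <<< "$1"
    read -ra ver2 <<< "$2"
    local v len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
    for (( v = 0; v < len; v++ )); do
        (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1   # strictly greater
        (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0   # strictly smaller
    done
    return 1                      # equal versions are not less-than
}

lt_sketch 1.15 2 && echo "old lcov"   # fires: 1 < 2 decides it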
setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:18.266 12:35:48 setup.sh.driver -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:18.266 12:35:48 setup.sh.driver -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:18.266 12:35:48 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:18.266 ************************************ 00:05:18.266 START TEST guess_driver 00:05:18.266 ************************************ 00:05:18.266 12:35:48 setup.sh.driver.guess_driver -- common/autotest_common.sh@1129 -- # guess_driver 00:05:18.266 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:18.266 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:18.266 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:18.266 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:18.266 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:18.266 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 160 > 0 )) 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:18.267 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:18.267 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:18.267 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:18.267 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:18.267 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:18.267 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:18.267 insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:18.267 Looking for driver=vfio-pci 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- 
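guess_driver settles on vfio-pci here because the host exposes 160 IOMMU groups and modprobe --show-depends vfio_pci resolves to a chain of insmod lines ending in .ko modules. The decision reduces to roughly the following sketch; the real pick_driver in test/setup/driver.sh also consults the unsafe-noiommu knob checked above:

# Sketch of the vfio-pci eligibility check traced above.
shopt -s nullglob
pick_driver_sketch() {
    local groups=(/sys/kernel/iommu_groups/*)
    if (( ${#groups[@]} > 0 )) &&
        modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
        echo vfio-pci
    else
        echo 'No valid driver found'   # the string the caller tests against
    fi
}
driver=$(pick_driver_sketch)   # vfio-pci on this box (160 groups > 0)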
# setup output config 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:18.267 12:35:48 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 
setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.555 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.556 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.556 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.556 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:21.556 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:21.556 12:35:51 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:24.846 12:35:54 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:24.846 12:35:54 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:24.846 12:35:54 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:24.846 12:35:54 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:24.846 12:35:54 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:24.846 12:35:54 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:24.846 12:35:54 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:29.037 00:05:29.037 real 0m10.682s 00:05:29.037 user 0m2.381s 00:05:29.037 sys 0m4.507s 00:05:29.037 12:35:58 setup.sh.driver.guess_driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:29.037 12:35:58 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:29.037 ************************************ 00:05:29.037 END TEST guess_driver 00:05:29.037 ************************************ 00:05:29.037 00:05:29.037 real 0m14.973s 00:05:29.037 user 0m3.566s 00:05:29.037 sys 0m6.839s 00:05:29.037 12:35:58 setup.sh.driver -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:29.037 12:35:58 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:29.037 ************************************ 00:05:29.037 END TEST driver 00:05:29.037 ************************************ 00:05:29.037 12:35:58 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:29.037 12:35:58 setup.sh -- 
common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:29.037 12:35:58 setup.sh -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:29.037 12:35:58 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:29.037 ************************************ 00:05:29.037 START TEST devices 00:05:29.037 ************************************ 00:05:29.037 12:35:58 setup.sh.devices -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:05:29.037 * Looking for test storage... 00:05:29.037 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:05:29.037 12:35:59 setup.sh.devices -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:05:29.037 12:35:59 setup.sh.devices -- common/autotest_common.sh@1693 -- # lcov --version 00:05:29.037 12:35:59 setup.sh.devices -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:05:29.037 12:35:59 setup.sh.devices -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@336 -- # IFS=.-: 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@336 -- # read -ra ver1 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@337 -- # IFS=.-: 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@337 -- # read -ra ver2 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@338 -- # local 'op=<' 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@340 -- # ver1_l=2 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@341 -- # ver2_l=1 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@344 -- # case "$op" in 00:05:29.037 12:35:59 setup.sh.devices -- scripts/common.sh@345 -- # : 1 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@365 -- # decimal 1 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@353 -- # local d=1 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@355 -- # echo 1 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@365 -- # ver1[v]=1 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@366 -- # decimal 2 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@353 -- # local d=2 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@355 -- # echo 2 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@366 -- # ver2[v]=2 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:29.038 12:35:59 setup.sh.devices -- scripts/common.sh@368 -- # return 0 00:05:29.038 12:35:59 setup.sh.devices -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:29.038 12:35:59 setup.sh.devices -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:05:29.038 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.038 --rc genhtml_branch_coverage=1 00:05:29.038 --rc genhtml_function_coverage=1 00:05:29.038 --rc genhtml_legend=1 00:05:29.038 --rc geninfo_all_blocks=1 00:05:29.038 --rc geninfo_unexecuted_blocks=1 00:05:29.038 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.038 ' 00:05:29.038 12:35:59 setup.sh.devices -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:05:29.038 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.038 --rc genhtml_branch_coverage=1 00:05:29.038 --rc genhtml_function_coverage=1 00:05:29.038 --rc genhtml_legend=1 00:05:29.038 --rc geninfo_all_blocks=1 00:05:29.038 --rc geninfo_unexecuted_blocks=1 00:05:29.038 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.038 ' 00:05:29.038 12:35:59 setup.sh.devices -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:05:29.038 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.038 --rc genhtml_branch_coverage=1 00:05:29.038 --rc genhtml_function_coverage=1 00:05:29.038 --rc genhtml_legend=1 00:05:29.038 --rc geninfo_all_blocks=1 00:05:29.038 --rc geninfo_unexecuted_blocks=1 00:05:29.038 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.038 ' 00:05:29.038 12:35:59 setup.sh.devices -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:05:29.038 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.038 --rc genhtml_branch_coverage=1 00:05:29.038 --rc genhtml_function_coverage=1 00:05:29.038 --rc genhtml_legend=1 00:05:29.038 --rc geninfo_all_blocks=1 00:05:29.038 --rc geninfo_unexecuted_blocks=1 00:05:29.038 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.038 ' 00:05:29.038 12:35:59 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:29.038 12:35:59 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:29.038 12:35:59 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:29.038 12:35:59 setup.sh.devices -- setup/common.sh@12 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:32.330 12:36:02 setup.sh.devices -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:05:32.330 12:36:02 setup.sh.devices -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:05:32.330 12:36:02 setup.sh.devices -- common/autotest_common.sh@1658 -- # local nvme bdf 00:05:32.330 12:36:02 setup.sh.devices -- common/autotest_common.sh@1660 -- # for nvme in /sys/block/nvme* 00:05:32.330 12:36:02 setup.sh.devices -- common/autotest_common.sh@1661 -- # is_block_zoned nvme0n1 00:05:32.330 12:36:02 setup.sh.devices -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:05:32.330 12:36:02 setup.sh.devices -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:32.330 12:36:02 setup.sh.devices -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:32.330 12:36:02 setup.sh.devices -- scripts/common.sh@381 -- # local block=nvme0n1 pt 00:05:32.330 12:36:02 setup.sh.devices -- scripts/common.sh@390 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:32.330 No valid GPT data, bailing 00:05:32.330 12:36:02 setup.sh.devices -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:32.330 12:36:02 setup.sh.devices -- scripts/common.sh@394 -- # pt= 00:05:32.330 12:36:02 setup.sh.devices -- scripts/common.sh@395 -- # return 1 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:32.330 12:36:02 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:32.330 12:36:02 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:32.330 12:36:02 setup.sh.devices -- setup/common.sh@80 -- # echo 4000787030016 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@204 -- # (( 4000787030016 >= min_disk_size )) 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:32.330 12:36:02 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:32.331 12:36:02 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:32.331 12:36:02 
setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:32.331 12:36:02 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:32.591 ************************************ 00:05:32.591 START TEST nvme_mount 00:05:32.591 ************************************ 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1129 -- # nvme_mount 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:32.591 12:36:02 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:33.529 Creating new GPT entries in memory. 00:05:33.529 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:33.529 other utilities. 00:05:33.529 12:36:03 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:33.529 12:36:03 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:33.529 12:36:03 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:33.529 12:36:03 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:33.529 12:36:03 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:34.501 Creating new GPT entries in memory. 00:05:34.501 The operation has completed successfully. 
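The trace above captures setup/common.sh carving the first test partition: sgdisk --zap-all wipes any stale GPT/MBR metadata, flock serializes access to the disk node while sgdisk writes the new entry, and sync_dev_uevents.sh blocks until the kernel publishes the partition device. A minimal stand-alone sketch of the same flow, assuming a disposable /dev/nvme0n1 and plain udevadm settle in place of SPDK's uevent helper:

# Sketch only, not the SPDK script itself: create one 1 GiB partition
# and wait for its device node, mirroring the sgdisk calls traced above.
disk=/dev/nvme0n1
size=1073741824                          # 1 GiB, as in setup/common.sh@41
sgdisk "$disk" --zap-all                 # destroy old partition tables
flock "$disk" sgdisk "$disk" --new=1:2048:$((2048 + size / 512 - 1))
udevadm settle                           # assumed stand-in for sync_dev_uevents.sh
[[ -b ${disk}p1 ]] && echo "${disk}p1 ready"

With 512-byte sectors the end sector works out to 2048 + 2097152 - 1 = 2099199, matching the --new=1:2048:2099199 call in the log.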
00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 580048 00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:34.501 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:34.830 12:36:04 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.438 
12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.438 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:37.439 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:37.439 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:37.698 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:37.698 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:05:37.698 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:37.698 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local 
mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:37.698 12:36:07 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' '' 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local 
pci status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:40.993 12:36:10 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ 
_ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:43.529 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:43.529 00:05:43.529 real 0m10.997s 00:05:43.529 user 0m2.918s 00:05:43.529 sys 0m5.783s 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:43.529 12:36:13 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:43.529 ************************************ 00:05:43.529 END TEST nvme_mount 00:05:43.529 ************************************ 00:05:43.529 12:36:13 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:43.529 12:36:13 setup.sh.devices -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:43.529 12:36:13 setup.sh.devices -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:43.529 12:36:13 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:43.529 ************************************ 00:05:43.529 START TEST dm_mount 00:05:43.530 ************************************ 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- common/autotest_common.sh@1129 -- # dm_mount 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:43.530 
12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:43.530 12:36:13 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:44.467 Creating new GPT entries in memory. 00:05:44.467 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:44.467 other utilities. 00:05:44.467 12:36:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:44.467 12:36:14 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:44.468 12:36:14 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:44.468 12:36:14 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:44.468 12:36:14 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:45.844 Creating new GPT entries in memory. 00:05:45.844 The operation has completed successfully. 00:05:45.844 12:36:15 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:45.844 12:36:15 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:45.844 12:36:15 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:45.844 12:36:15 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:45.844 12:36:15 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:46.783 The operation has completed successfully. 
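At this point two 1 GiB partitions exist (sectors 2048-2099199 and 2099200-4196351). The dm_mount test that continues below binds them into a single device-mapper node named nvme_dm_test before formatting and mounting it. The log never shows the table piped into dmsetup create, so the linear concatenation below is an assumption, a sketch of how two partitions are commonly joined into one dm device:

# Hedged sketch: join the two test partitions into one linear dm target;
# the actual table used by devices.sh is not visible in this log.
p1=/dev/nvme0n1p1
p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1")             # sizes in 512-byte sectors
s2=$(blockdev --getsz "$p2")
dmsetup create nvme_dm_test <<EOF
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
EOF
dm=$(readlink -f /dev/mapper/nvme_dm_test)   # resolves to /dev/dm-0 in this run
ls /sys/class/block/${p1##*/}/holders        # lists the dm node once bound

The readlink and holders checks mirror devices.sh@165 and @168: each backing partition exposes the dm device under its holders directory, which is where the holder@nvme0n1p1:dm-0 entries in the status lines further down come from.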
00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 584152 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- 
setup/devices.sh@53 -- # local found=0 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:46.784 12:36:16 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:50.074 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:50.075 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:50.075 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.075 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:50.075 12:36:19 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:50.075 12:36:19 
setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:50.075 12:36:19 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.610 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:52.611 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:52.611 00:05:52.611 real 0m8.889s 00:05:52.611 user 0m1.927s 00:05:52.611 sys 0m3.925s 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:52.611 12:36:22 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:52.611 ************************************ 00:05:52.611 END TEST dm_mount 00:05:52.611 ************************************ 00:05:52.611 12:36:22 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:52.611 12:36:22 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:52.611 12:36:22 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:52.611 12:36:22 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:52.611 12:36:22 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:52.611 12:36:22 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:52.611 12:36:22 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:52.611 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:52.611 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:05:52.611 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:52.611 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:52.611 12:36:22 
setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:52.611 12:36:22 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:52.870 12:36:22 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:52.870 12:36:22 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:52.870 12:36:22 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:52.870 12:36:22 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:52.870 12:36:22 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:52.870 00:05:52.870 real 0m23.822s 00:05:52.870 user 0m6.118s 00:05:52.870 sys 0m12.206s 00:05:52.870 12:36:22 setup.sh.devices -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:52.870 12:36:22 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:52.870 ************************************ 00:05:52.870 END TEST devices 00:05:52.870 ************************************ 00:05:52.870 00:05:52.870 real 1m25.892s 00:05:52.870 user 0m24.052s 00:05:52.870 sys 0m45.182s 00:05:52.870 12:36:22 setup.sh -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:52.870 12:36:22 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:52.870 ************************************ 00:05:52.870 END TEST setup.sh 00:05:52.870 ************************************ 00:05:52.870 12:36:22 -- spdk/autotest.sh@115 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:55.406 Hugepages 00:05:55.406 node hugesize free / total 00:05:55.406 node0 1048576kB 0 / 0 00:05:55.406 node0 2048kB 1024 / 1024 00:05:55.406 node1 1048576kB 0 / 0 00:05:55.406 node1 2048kB 1024 / 1024 00:05:55.406 00:05:55.406 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:55.406 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:55.406 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:55.406 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:55.406 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:55.406 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:55.406 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:55.406 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:55.406 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:55.406 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:05:55.406 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:55.406 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:55.406 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:55.406 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:55.406 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:55.406 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:55.406 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:55.406 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:55.406 12:36:25 -- spdk/autotest.sh@117 -- # uname -s 00:05:55.406 12:36:25 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:05:55.406 12:36:25 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:05:55.406 12:36:25 -- common/autotest_common.sh@1516 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:57.942 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:00:04.1 
(8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:57.942 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:01.235 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:06:01.235 12:36:31 -- common/autotest_common.sh@1517 -- # sleep 1 00:06:02.614 12:36:32 -- common/autotest_common.sh@1518 -- # bdfs=() 00:06:02.614 12:36:32 -- common/autotest_common.sh@1518 -- # local bdfs 00:06:02.614 12:36:32 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:06:02.614 12:36:32 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:06:02.614 12:36:32 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:02.614 12:36:32 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:02.614 12:36:32 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:02.614 12:36:32 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:02.614 12:36:32 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:02.614 12:36:32 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:02.614 12:36:32 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:5e:00.0 00:06:02.614 12:36:32 -- common/autotest_common.sh@1522 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:06:05.147 Waiting for block devices as requested 00:06:05.147 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:06:05.147 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:05.406 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:05.406 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:05.406 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:05.665 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:05.665 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:05.665 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:05.924 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:05.924 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:05.924 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:05.924 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:06.184 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:06.184 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:06.184 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:06.443 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:06.443 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:06.443 12:36:36 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:06:06.443 12:36:36 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:06:06.443 12:36:36 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 00:06:06.443 12:36:36 -- common/autotest_common.sh@1487 -- # grep 0000:5e:00.0/nvme/nvme 00:06:06.443 12:36:36 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:06.444 12:36:36 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:06:06.444 12:36:36 -- common/autotest_common.sh@1492 -- # basename 
/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:06.444 12:36:36 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:06:06.444 12:36:36 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:06:06.444 12:36:36 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:06:06.444 12:36:36 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:06:06.444 12:36:36 -- common/autotest_common.sh@1531 -- # grep oacs 00:06:06.444 12:36:36 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:06:06.703 12:36:36 -- common/autotest_common.sh@1531 -- # oacs=' 0xe' 00:06:06.703 12:36:36 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:06:06.703 12:36:36 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:06:06.703 12:36:36 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:06:06.703 12:36:36 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:06:06.703 12:36:36 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:06:06.703 12:36:36 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:06:06.703 12:36:36 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:06:06.703 12:36:36 -- common/autotest_common.sh@1543 -- # continue 00:06:06.703 12:36:36 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:06:06.703 12:36:36 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:06.703 12:36:36 -- common/autotest_common.sh@10 -- # set +x 00:06:06.703 12:36:36 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:06:06.703 12:36:36 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:06.703 12:36:36 -- common/autotest_common.sh@10 -- # set +x 00:06:06.703 12:36:36 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:06:09.993 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:09.993 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:13.288 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:06:13.288 12:36:43 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:06:13.288 12:36:43 -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:13.288 12:36:43 -- common/autotest_common.sh@10 -- # set +x 00:06:13.288 12:36:43 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:06:13.288 12:36:43 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:06:13.288 12:36:43 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:06:13.288 12:36:43 -- common/autotest_common.sh@1563 -- # bdfs=() 00:06:13.288 12:36:43 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:06:13.288 12:36:43 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:06:13.288 12:36:43 -- common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:06:13.288 12:36:43 -- 
common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:06:13.288 12:36:43 -- common/autotest_common.sh@1498 -- # bdfs=() 00:06:13.288 12:36:43 -- common/autotest_common.sh@1498 -- # local bdfs 00:06:13.288 12:36:43 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:13.288 12:36:43 -- common/autotest_common.sh@1499 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:13.288 12:36:43 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:06:13.288 12:36:43 -- common/autotest_common.sh@1500 -- # (( 1 == 0 )) 00:06:13.288 12:36:43 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:5e:00.0 00:06:13.288 12:36:43 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 00:06:13.288 12:36:43 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:13.288 12:36:43 -- common/autotest_common.sh@1566 -- # device=0x0a54 00:06:13.288 12:36:43 -- common/autotest_common.sh@1567 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:13.288 12:36:43 -- common/autotest_common.sh@1568 -- # bdfs+=($bdf) 00:06:13.288 12:36:43 -- common/autotest_common.sh@1572 -- # (( 1 > 0 )) 00:06:13.288 12:36:43 -- common/autotest_common.sh@1573 -- # printf '%s\n' 0000:5e:00.0 00:06:13.288 12:36:43 -- common/autotest_common.sh@1579 -- # [[ -z 0000:5e:00.0 ]] 00:06:13.288 12:36:43 -- common/autotest_common.sh@1584 -- # spdk_tgt_pid=591930 00:06:13.288 12:36:43 -- common/autotest_common.sh@1583 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:13.288 12:36:43 -- common/autotest_common.sh@1585 -- # waitforlisten 591930 00:06:13.288 12:36:43 -- common/autotest_common.sh@835 -- # '[' -z 591930 ']' 00:06:13.288 12:36:43 -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:13.288 12:36:43 -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:13.288 12:36:43 -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:13.288 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:13.288 12:36:43 -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:13.288 12:36:43 -- common/autotest_common.sh@10 -- # set +x 00:06:13.288 [2024-11-28 12:36:43.307946] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:13.288 [2024-11-28 12:36:43.308010] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid591930 ] 00:06:13.549 [2024-11-28 12:36:43.445137] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
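[Editor's note] The trace above builds the controller list by piping scripts/gen_nvme.sh through jq and then filters each BDF by its PCI device ID via sysfs. A condensed sketch of that flow — the script path, jq filter, and the 0x0a54 device ID are taken from the log itself; the loop shape is illustrative, not the exact helper:

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    for bdf in $("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'); do
        # keep only controllers whose PCI device ID matches 0x0a54
        [[ $(cat "/sys/bus/pci/devices/$bdf/device") == 0x0a54 ]] && echo "$bdf"
    done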
00:06:13.549 [2024-11-28 12:36:43.479074] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.549 [2024-11-28 12:36:43.502918] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.117 12:36:44 -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:14.117 12:36:44 -- common/autotest_common.sh@868 -- # return 0 00:06:14.117 12:36:44 -- common/autotest_common.sh@1587 -- # bdf_id=0 00:06:14.117 12:36:44 -- common/autotest_common.sh@1588 -- # for bdf in "${bdfs[@]}" 00:06:14.117 12:36:44 -- common/autotest_common.sh@1589 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:5e:00.0 00:06:17.407 nvme0n1 00:06:17.407 12:36:47 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:17.407 [2024-11-28 12:36:47.337074] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:17.407 request: 00:06:17.407 { 00:06:17.407 "nvme_ctrlr_name": "nvme0", 00:06:17.407 "password": "test", 00:06:17.407 "method": "bdev_nvme_opal_revert", 00:06:17.407 "req_id": 1 00:06:17.407 } 00:06:17.407 Got JSON-RPC error response 00:06:17.407 response: 00:06:17.407 { 00:06:17.407 "code": -32602, 00:06:17.407 "message": "Invalid parameters" 00:06:17.407 } 00:06:17.407 12:36:47 -- common/autotest_common.sh@1591 -- # true 00:06:17.407 12:36:47 -- common/autotest_common.sh@1592 -- # (( ++bdf_id )) 00:06:17.407 12:36:47 -- common/autotest_common.sh@1595 -- # killprocess 591930 00:06:17.407 12:36:47 -- common/autotest_common.sh@954 -- # '[' -z 591930 ']' 00:06:17.407 12:36:47 -- common/autotest_common.sh@958 -- # kill -0 591930 00:06:17.407 12:36:47 -- common/autotest_common.sh@959 -- # uname 00:06:17.407 12:36:47 -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:17.407 12:36:47 -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 591930 00:06:17.407 12:36:47 -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:17.407 12:36:47 -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:17.407 12:36:47 -- common/autotest_common.sh@972 -- # echo 'killing process with pid 591930' 00:06:17.407 killing process with pid 591930 00:06:17.407 12:36:47 -- common/autotest_common.sh@973 -- # kill 591930 00:06:17.407 12:36:47 -- common/autotest_common.sh@978 -- # wait 591930 00:06:21.598 12:36:51 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']' 00:06:21.598 12:36:51 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']' 00:06:21.598 12:36:51 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:21.598 12:36:51 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]] 00:06:21.598 12:36:51 -- spdk/autotest.sh@149 -- # timing_enter lib 00:06:21.598 12:36:51 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:21.598 12:36:51 -- common/autotest_common.sh@10 -- # set +x 00:06:21.598 12:36:51 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]] 00:06:21.598 12:36:51 -- spdk/autotest.sh@155 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:21.598 12:36:51 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:21.598 12:36:51 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:21.598 12:36:51 -- common/autotest_common.sh@10 -- # set +x 00:06:21.598 ************************************ 00:06:21.598 START TEST env 00:06:21.598 ************************************ 00:06:21.598 12:36:51 env -- common/autotest_common.sh@1129 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:06:21.598 * Looking for test storage... 00:06:21.598 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:06:21.598 12:36:51 env -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:21.598 12:36:51 env -- common/autotest_common.sh@1693 -- # lcov --version 00:06:21.598 12:36:51 env -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:21.598 12:36:51 env -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:21.598 12:36:51 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:21.598 12:36:51 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:21.598 12:36:51 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:21.599 12:36:51 env -- scripts/common.sh@336 -- # IFS=.-: 00:06:21.599 12:36:51 env -- scripts/common.sh@336 -- # read -ra ver1 00:06:21.599 12:36:51 env -- scripts/common.sh@337 -- # IFS=.-: 00:06:21.599 12:36:51 env -- scripts/common.sh@337 -- # read -ra ver2 00:06:21.599 12:36:51 env -- scripts/common.sh@338 -- # local 'op=<' 00:06:21.599 12:36:51 env -- scripts/common.sh@340 -- # ver1_l=2 00:06:21.599 12:36:51 env -- scripts/common.sh@341 -- # ver2_l=1 00:06:21.599 12:36:51 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:21.599 12:36:51 env -- scripts/common.sh@344 -- # case "$op" in 00:06:21.599 12:36:51 env -- scripts/common.sh@345 -- # : 1 00:06:21.599 12:36:51 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:21.599 12:36:51 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:21.599 12:36:51 env -- scripts/common.sh@365 -- # decimal 1 00:06:21.599 12:36:51 env -- scripts/common.sh@353 -- # local d=1 00:06:21.599 12:36:51 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:21.599 12:36:51 env -- scripts/common.sh@355 -- # echo 1 00:06:21.599 12:36:51 env -- scripts/common.sh@365 -- # ver1[v]=1 00:06:21.599 12:36:51 env -- scripts/common.sh@366 -- # decimal 2 00:06:21.599 12:36:51 env -- scripts/common.sh@353 -- # local d=2 00:06:21.599 12:36:51 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:21.599 12:36:51 env -- scripts/common.sh@355 -- # echo 2 00:06:21.599 12:36:51 env -- scripts/common.sh@366 -- # ver2[v]=2 00:06:21.599 12:36:51 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:21.599 12:36:51 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:21.599 12:36:51 env -- scripts/common.sh@368 -- # return 0 00:06:21.599 12:36:51 env -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:21.599 12:36:51 env -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:21.599 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.599 --rc genhtml_branch_coverage=1 00:06:21.599 --rc genhtml_function_coverage=1 00:06:21.599 --rc genhtml_legend=1 00:06:21.599 --rc geninfo_all_blocks=1 00:06:21.599 --rc geninfo_unexecuted_blocks=1 00:06:21.599 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.599 ' 00:06:21.599 12:36:51 env -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:21.599 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.599 --rc genhtml_branch_coverage=1 00:06:21.599 --rc genhtml_function_coverage=1 00:06:21.599 --rc genhtml_legend=1 00:06:21.599 --rc geninfo_all_blocks=1 00:06:21.599 --rc geninfo_unexecuted_blocks=1 00:06:21.599 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.599 ' 00:06:21.599 12:36:51 env -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:21.599 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.599 --rc genhtml_branch_coverage=1 00:06:21.599 --rc genhtml_function_coverage=1 00:06:21.599 --rc genhtml_legend=1 00:06:21.599 --rc geninfo_all_blocks=1 00:06:21.599 --rc geninfo_unexecuted_blocks=1 00:06:21.599 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.599 ' 00:06:21.599 12:36:51 env -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:21.599 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:21.599 --rc genhtml_branch_coverage=1 00:06:21.599 --rc genhtml_function_coverage=1 00:06:21.599 --rc genhtml_legend=1 00:06:21.599 --rc geninfo_all_blocks=1 00:06:21.599 --rc geninfo_unexecuted_blocks=1 00:06:21.599 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:21.599 ' 00:06:21.599 12:36:51 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:21.599 12:36:51 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:21.599 12:36:51 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:21.599 12:36:51 env -- common/autotest_common.sh@10 -- # set +x 00:06:21.599 ************************************ 00:06:21.599 START TEST env_memory 00:06:21.599 ************************************ 00:06:21.599 12:36:51 env.env_memory -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:06:21.599 00:06:21.599 00:06:21.599 CUnit - A unit testing framework for C - Version 2.1-3 00:06:21.599 http://cunit.sourceforge.net/ 00:06:21.599 00:06:21.599 00:06:21.599 Suite: memory 00:06:21.599 Test: alloc and free memory map ...[2024-11-28 12:36:51.672999] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:21.599 passed 00:06:21.599 Test: mem map translation ...[2024-11-28 12:36:51.687415] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:21.599 [2024-11-28 12:36:51.687433] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 596:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:21.599 [2024-11-28 12:36:51.687464] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:21.599 [2024-11-28 12:36:51.687476] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:21.599 passed 00:06:21.599 Test: mem map registration ...[2024-11-28 12:36:51.707871] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:06:21.599 [2024-11-28 12:36:51.707885] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 348:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:06:21.599 passed 00:06:21.857 Test: mem 
map adjacent registrations ...passed 00:06:21.857 00:06:21.857 Run Summary: Type Total Ran Passed Failed Inactive 00:06:21.857 suites 1 1 n/a 0 0 00:06:21.857 tests 4 4 4 0 0 00:06:21.857 asserts 152 152 152 0 n/a 00:06:21.857 00:06:21.857 Elapsed time = 0.087 seconds 00:06:21.857 00:06:21.857 real 0m0.100s 00:06:21.857 user 0m0.085s 00:06:21.857 sys 0m0.014s 00:06:21.857 12:36:51 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:21.857 12:36:51 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:21.857 ************************************ 00:06:21.857 END TEST env_memory 00:06:21.857 ************************************ 00:06:21.857 12:36:51 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:21.857 12:36:51 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:21.857 12:36:51 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:21.857 12:36:51 env -- common/autotest_common.sh@10 -- # set +x 00:06:21.857 ************************************ 00:06:21.857 START TEST env_vtophys 00:06:21.857 ************************************ 00:06:21.857 12:36:51 env.env_vtophys -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:21.857 EAL: lib.eal log level changed from notice to debug 00:06:21.857 EAL: Detected lcore 0 as core 0 on socket 0 00:06:21.857 EAL: Detected lcore 1 as core 1 on socket 0 00:06:21.857 EAL: Detected lcore 2 as core 2 on socket 0 00:06:21.857 EAL: Detected lcore 3 as core 3 on socket 0 00:06:21.857 EAL: Detected lcore 4 as core 4 on socket 0 00:06:21.857 EAL: Detected lcore 5 as core 8 on socket 0 00:06:21.857 EAL: Detected lcore 6 as core 9 on socket 0 00:06:21.857 EAL: Detected lcore 7 as core 10 on socket 0 00:06:21.857 EAL: Detected lcore 8 as core 11 on socket 0 00:06:21.857 EAL: Detected lcore 9 as core 16 on socket 0 00:06:21.857 EAL: Detected lcore 10 as core 17 on socket 0 00:06:21.857 EAL: Detected lcore 11 as core 18 on socket 0 00:06:21.857 EAL: Detected lcore 12 as core 19 on socket 0 00:06:21.857 EAL: Detected lcore 13 as core 20 on socket 0 00:06:21.857 EAL: Detected lcore 14 as core 24 on socket 0 00:06:21.857 EAL: Detected lcore 15 as core 25 on socket 0 00:06:21.857 EAL: Detected lcore 16 as core 26 on socket 0 00:06:21.857 EAL: Detected lcore 17 as core 27 on socket 0 00:06:21.857 EAL: Detected lcore 18 as core 0 on socket 1 00:06:21.857 EAL: Detected lcore 19 as core 1 on socket 1 00:06:21.857 EAL: Detected lcore 20 as core 2 on socket 1 00:06:21.857 EAL: Detected lcore 21 as core 3 on socket 1 00:06:21.857 EAL: Detected lcore 22 as core 4 on socket 1 00:06:21.857 EAL: Detected lcore 23 as core 8 on socket 1 00:06:21.857 EAL: Detected lcore 24 as core 9 on socket 1 00:06:21.857 EAL: Detected lcore 25 as core 10 on socket 1 00:06:21.857 EAL: Detected lcore 26 as core 11 on socket 1 00:06:21.858 EAL: Detected lcore 27 as core 16 on socket 1 00:06:21.858 EAL: Detected lcore 28 as core 17 on socket 1 00:06:21.858 EAL: Detected lcore 29 as core 18 on socket 1 00:06:21.858 EAL: Detected lcore 30 as core 19 on socket 1 00:06:21.858 EAL: Detected lcore 31 as core 20 on socket 1 00:06:21.858 EAL: Detected lcore 32 as core 24 on socket 1 00:06:21.858 EAL: Detected lcore 33 as core 25 on socket 1 00:06:21.858 EAL: Detected lcore 34 as core 26 on socket 1 00:06:21.858 EAL: Detected lcore 35 as core 27 on socket 1 00:06:21.858 EAL: Detected lcore 36 as core 0 on socket 
0 00:06:21.858 EAL: Detected lcore 37 as core 1 on socket 0 00:06:21.858 EAL: Detected lcore 38 as core 2 on socket 0 00:06:21.858 EAL: Detected lcore 39 as core 3 on socket 0 00:06:21.858 EAL: Detected lcore 40 as core 4 on socket 0 00:06:21.858 EAL: Detected lcore 41 as core 8 on socket 0 00:06:21.858 EAL: Detected lcore 42 as core 9 on socket 0 00:06:21.858 EAL: Detected lcore 43 as core 10 on socket 0 00:06:21.858 EAL: Detected lcore 44 as core 11 on socket 0 00:06:21.858 EAL: Detected lcore 45 as core 16 on socket 0 00:06:21.858 EAL: Detected lcore 46 as core 17 on socket 0 00:06:21.858 EAL: Detected lcore 47 as core 18 on socket 0 00:06:21.858 EAL: Detected lcore 48 as core 19 on socket 0 00:06:21.858 EAL: Detected lcore 49 as core 20 on socket 0 00:06:21.858 EAL: Detected lcore 50 as core 24 on socket 0 00:06:21.858 EAL: Detected lcore 51 as core 25 on socket 0 00:06:21.858 EAL: Detected lcore 52 as core 26 on socket 0 00:06:21.858 EAL: Detected lcore 53 as core 27 on socket 0 00:06:21.858 EAL: Detected lcore 54 as core 0 on socket 1 00:06:21.858 EAL: Detected lcore 55 as core 1 on socket 1 00:06:21.858 EAL: Detected lcore 56 as core 2 on socket 1 00:06:21.858 EAL: Detected lcore 57 as core 3 on socket 1 00:06:21.858 EAL: Detected lcore 58 as core 4 on socket 1 00:06:21.858 EAL: Detected lcore 59 as core 8 on socket 1 00:06:21.858 EAL: Detected lcore 60 as core 9 on socket 1 00:06:21.858 EAL: Detected lcore 61 as core 10 on socket 1 00:06:21.858 EAL: Detected lcore 62 as core 11 on socket 1 00:06:21.858 EAL: Detected lcore 63 as core 16 on socket 1 00:06:21.858 EAL: Detected lcore 64 as core 17 on socket 1 00:06:21.858 EAL: Detected lcore 65 as core 18 on socket 1 00:06:21.858 EAL: Detected lcore 66 as core 19 on socket 1 00:06:21.858 EAL: Detected lcore 67 as core 20 on socket 1 00:06:21.858 EAL: Detected lcore 68 as core 24 on socket 1 00:06:21.858 EAL: Detected lcore 69 as core 25 on socket 1 00:06:21.858 EAL: Detected lcore 70 as core 26 on socket 1 00:06:21.858 EAL: Detected lcore 71 as core 27 on socket 1 00:06:21.858 EAL: Maximum logical cores by configuration: 128 00:06:21.858 EAL: Detected CPU lcores: 72 00:06:21.858 EAL: Detected NUMA nodes: 2 00:06:21.858 EAL: Checking presence of .so 'librte_eal.so.25.0' 00:06:21.858 EAL: Checking presence of .so 'librte_eal.so.25' 00:06:21.858 EAL: Checking presence of .so 'librte_eal.so' 00:06:21.858 EAL: Detected static linkage of DPDK 00:06:21.858 EAL: No shared files mode enabled, IPC will be disabled 00:06:21.858 EAL: Bus pci wants IOVA as 'DC' 00:06:21.858 EAL: Buses did not request a specific IOVA mode. 00:06:21.858 EAL: IOMMU is available, selecting IOVA as VA mode. 00:06:21.858 EAL: Selected IOVA mode 'VA' 00:06:21.858 EAL: Probing VFIO support... 00:06:21.858 EAL: No shared files mode enabled, IPC is disabled 00:06:21.858 EAL: IOMMU type 1 (Type 1) is supported 00:06:21.858 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:21.858 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:21.858 EAL: VFIO support initialized 00:06:21.858 EAL: Ask a virtual area of 0x2e000 bytes 00:06:21.858 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:21.858 EAL: Setting up physically contiguous memory... 
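[Editor's note] Each "Detected lcore N as core M on socket S" line mirrors the kernel's CPU topology files that EAL consults. A minimal sketch to reproduce the same mapping from sysfs for cross-checking (assumes the package ID equals the socket/NUMA node, which holds on this machine):

    for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
        lcore=${cpu##*cpu}                                  # logical CPU number
        core=$(cat "$cpu/topology/core_id")                 # physical core id
        socket=$(cat "$cpu/topology/physical_package_id")   # package ~= socket here
        echo "lcore $lcore as core $core on socket $socket"
    done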
00:06:21.858 EAL: Setting maximum number of open files to 524288 00:06:21.858 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:21.858 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:21.858 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:21.858 EAL: Ask a virtual area of 0x61000 bytes 00:06:21.858 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:21.858 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:21.858 EAL: Ask a virtual area of 0x400000000 bytes 00:06:21.858 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:21.858 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:21.858 EAL: Ask a virtual area of 0x61000 bytes 00:06:21.858 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:21.858 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:21.858 EAL: Ask a virtual area of 0x400000000 bytes 00:06:21.858 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:21.858 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:21.858 EAL: Ask a virtual area of 0x61000 bytes 00:06:21.858 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:21.858 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:21.858 EAL: Ask a virtual area of 0x400000000 bytes 00:06:21.858 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:21.858 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:21.858 EAL: Ask a virtual area of 0x61000 bytes 00:06:21.858 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:21.858 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:21.858 EAL: Ask a virtual area of 0x400000000 bytes 00:06:21.858 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:21.858 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:21.858 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:21.858 EAL: Ask a virtual area of 0x61000 bytes 00:06:21.858 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:21.858 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:21.858 EAL: Ask a virtual area of 0x400000000 bytes 00:06:21.858 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:21.858 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:21.858 EAL: Ask a virtual area of 0x61000 bytes 00:06:21.858 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:21.858 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:21.858 EAL: Ask a virtual area of 0x400000000 bytes 00:06:21.858 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:21.858 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:21.858 EAL: Ask a virtual area of 0x61000 bytes 00:06:21.858 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:21.858 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:21.858 EAL: Ask a virtual area of 0x400000000 bytes 00:06:21.858 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:21.858 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:21.858 EAL: Ask a virtual area of 0x61000 bytes 00:06:21.858 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:21.858 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:21.858 EAL: Ask a virtual area of 0x400000000 bytes 00:06:21.858 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:06:21.858 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:21.858 EAL: Hugepages will be freed exactly as allocated. 00:06:21.858 EAL: No shared files mode enabled, IPC is disabled 00:06:21.858 EAL: No shared files mode enabled, IPC is disabled 00:06:21.858 EAL: Refined arch frequency 2300000000 to measured frequency 2294605017 00:06:21.858 EAL: TSC frequency is ~2294600 KHz 00:06:21.858 EAL: Main lcore 0 is ready (tid=7f55f4a98a00;cpuset=[0]) 00:06:21.858 EAL: Trying to obtain current memory policy. 00:06:21.858 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:21.858 EAL: Restoring previous memory policy: 0 00:06:21.858 EAL: request: mp_malloc_sync 00:06:21.858 EAL: No shared files mode enabled, IPC is disabled 00:06:21.858 EAL: Heap on socket 0 was expanded by 2MB 00:06:21.858 EAL: Allocated 2112 bytes of per-lcore data with a 64-byte alignment 00:06:22.116 EAL: Mem event callback 'spdk:(nil)' registered 00:06:22.116 00:06:22.116 00:06:22.116 CUnit - A unit testing framework for C - Version 2.1-3 00:06:22.116 http://cunit.sourceforge.net/ 00:06:22.116 00:06:22.116 00:06:22.116 Suite: components_suite 00:06:22.116 Test: vtophys_malloc_test ...passed 00:06:22.116 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:06:22.116 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.116 EAL: Restoring previous memory policy: 4 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was expanded by 4MB 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was shrunk by 4MB 00:06:22.116 EAL: Trying to obtain current memory policy. 00:06:22.116 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.116 EAL: Restoring previous memory policy: 4 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was expanded by 6MB 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was shrunk by 6MB 00:06:22.116 EAL: Trying to obtain current memory policy. 00:06:22.116 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.116 EAL: Restoring previous memory policy: 4 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was expanded by 10MB 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was shrunk by 10MB 00:06:22.116 EAL: Trying to obtain current memory policy. 
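[Editor's note] The repeated "size = 0x400000000" reservations follow directly from the memseg-list parameters printed earlier (n_segs:8192, hugepage_sz:2097152). A quick arithmetic check, using only figures from the log:

    # 8192 segments of 2 MiB per memseg list => 16 GiB of VA reserved per list;
    # with 4 lists per socket that is 64 GiB of address space per NUMA node
    printf '%d bytes = 0x%x\n' $((8192 * 2 * 1024 * 1024)) $((8192 * 2 * 1024 * 1024))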
00:06:22.116 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.116 EAL: Restoring previous memory policy: 4 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was expanded by 18MB 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was shrunk by 18MB 00:06:22.116 EAL: Trying to obtain current memory policy. 00:06:22.116 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.116 EAL: Restoring previous memory policy: 4 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was expanded by 34MB 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was shrunk by 34MB 00:06:22.116 EAL: Trying to obtain current memory policy. 00:06:22.116 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.116 EAL: Restoring previous memory policy: 4 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.116 EAL: request: mp_malloc_sync 00:06:22.116 EAL: No shared files mode enabled, IPC is disabled 00:06:22.116 EAL: Heap on socket 0 was expanded by 66MB 00:06:22.116 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.117 EAL: request: mp_malloc_sync 00:06:22.117 EAL: No shared files mode enabled, IPC is disabled 00:06:22.117 EAL: Heap on socket 0 was shrunk by 66MB 00:06:22.117 EAL: Trying to obtain current memory policy. 00:06:22.117 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.117 EAL: Restoring previous memory policy: 4 00:06:22.117 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.117 EAL: request: mp_malloc_sync 00:06:22.117 EAL: No shared files mode enabled, IPC is disabled 00:06:22.117 EAL: Heap on socket 0 was expanded by 130MB 00:06:22.117 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.117 EAL: request: mp_malloc_sync 00:06:22.117 EAL: No shared files mode enabled, IPC is disabled 00:06:22.117 EAL: Heap on socket 0 was shrunk by 130MB 00:06:22.117 EAL: Trying to obtain current memory policy. 00:06:22.117 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.117 EAL: Restoring previous memory policy: 4 00:06:22.117 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.117 EAL: request: mp_malloc_sync 00:06:22.117 EAL: No shared files mode enabled, IPC is disabled 00:06:22.117 EAL: Heap on socket 0 was expanded by 258MB 00:06:22.117 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.117 EAL: request: mp_malloc_sync 00:06:22.117 EAL: No shared files mode enabled, IPC is disabled 00:06:22.117 EAL: Heap on socket 0 was shrunk by 258MB 00:06:22.117 EAL: Trying to obtain current memory policy. 
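[Editor's note] Each expand/shrink pair above comes from the test allocating and freeing through the SPDK env layer, which fires the registered 'spdk:(nil)' mem event callback; the requested sizes grow each round (4, 6, 10, 18, 34 MB and so on). One way to watch the same growth from outside the test, purely illustrative:

    # free 2 MiB hugepages drop as the heap expands, then recover on each shrink
    watch -n 1 'grep -E "HugePages_(Free|Total)" /proc/meminfo'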
00:06:22.117 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.375 EAL: Restoring previous memory policy: 4 00:06:22.375 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.375 EAL: request: mp_malloc_sync 00:06:22.375 EAL: No shared files mode enabled, IPC is disabled 00:06:22.375 EAL: Heap on socket 0 was expanded by 514MB 00:06:22.375 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.375 EAL: request: mp_malloc_sync 00:06:22.375 EAL: No shared files mode enabled, IPC is disabled 00:06:22.375 EAL: Heap on socket 0 was shrunk by 514MB 00:06:22.375 EAL: Trying to obtain current memory policy. 00:06:22.375 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:22.633 EAL: Restoring previous memory policy: 4 00:06:22.633 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.633 EAL: request: mp_malloc_sync 00:06:22.633 EAL: No shared files mode enabled, IPC is disabled 00:06:22.633 EAL: Heap on socket 0 was expanded by 1026MB 00:06:22.892 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.892 EAL: request: mp_malloc_sync 00:06:22.892 EAL: No shared files mode enabled, IPC is disabled 00:06:22.892 EAL: Heap on socket 0 was shrunk by 1026MB 00:06:22.892 passed 00:06:22.892 00:06:22.892 Run Summary: Type Total Ran Passed Failed Inactive 00:06:22.892 suites 1 1 n/a 0 0 00:06:22.892 tests 2 2 2 0 0 00:06:22.892 asserts 497 497 497 0 n/a 00:06:22.892 00:06:22.892 Elapsed time = 0.980 seconds 00:06:22.892 EAL: Calling mem event callback 'spdk:(nil)' 00:06:22.892 EAL: request: mp_malloc_sync 00:06:22.892 EAL: No shared files mode enabled, IPC is disabled 00:06:22.892 EAL: Heap on socket 0 was shrunk by 2MB 00:06:22.892 EAL: No shared files mode enabled, IPC is disabled 00:06:22.892 EAL: No shared files mode enabled, IPC is disabled 00:06:22.892 EAL: No shared files mode enabled, IPC is disabled 00:06:22.892 00:06:22.892 real 0m1.209s 00:06:22.892 user 0m0.634s 00:06:22.892 sys 0m0.442s 00:06:22.892 12:36:53 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:22.892 12:36:53 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:06:22.892 ************************************ 00:06:22.892 END TEST env_vtophys 00:06:22.892 ************************************ 00:06:23.151 12:36:53 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:23.151 12:36:53 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:23.151 12:36:53 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:23.151 12:36:53 env -- common/autotest_common.sh@10 -- # set +x 00:06:23.151 ************************************ 00:06:23.151 START TEST env_pci 00:06:23.151 ************************************ 00:06:23.151 12:36:53 env.env_pci -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:06:23.151 00:06:23.151 00:06:23.151 CUnit - A unit testing framework for C - Version 2.1-3 00:06:23.151 http://cunit.sourceforge.net/ 00:06:23.151 00:06:23.151 00:06:23.151 Suite: pci 00:06:23.151 Test: pci_hook ...[2024-11-28 12:36:53.103231] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1118:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 593270 has claimed it 00:06:23.151 EAL: Cannot find device (10000:00:01.0) 00:06:23.151 EAL: Failed to attach device on primary process 00:06:23.151 passed 00:06:23.151 00:06:23.151 Run Summary: Type Total Ran Passed Failed Inactive 
00:06:23.151 suites 1 1 n/a 0 0 00:06:23.151 tests 1 1 1 0 0 00:06:23.151 asserts 25 25 25 0 n/a 00:06:23.151 00:06:23.151 Elapsed time = 0.034 seconds 00:06:23.151 00:06:23.151 real 0m0.053s 00:06:23.151 user 0m0.008s 00:06:23.151 sys 0m0.045s 00:06:23.151 12:36:53 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:23.151 12:36:53 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:06:23.151 ************************************ 00:06:23.151 END TEST env_pci 00:06:23.151 ************************************ 00:06:23.151 12:36:53 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:06:23.151 12:36:53 env -- env/env.sh@15 -- # uname 00:06:23.151 12:36:53 env -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:06:23.151 12:36:53 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:06:23.151 12:36:53 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:23.151 12:36:53 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:06:23.151 12:36:53 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:23.151 12:36:53 env -- common/autotest_common.sh@10 -- # set +x 00:06:23.151 ************************************ 00:06:23.151 START TEST env_dpdk_post_init 00:06:23.151 ************************************ 00:06:23.151 12:36:53 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:06:23.151 EAL: Detected CPU lcores: 72 00:06:23.151 EAL: Detected NUMA nodes: 2 00:06:23.151 EAL: Detected static linkage of DPDK 00:06:23.151 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:23.151 EAL: Selected IOVA mode 'VA' 00:06:23.151 EAL: VFIO support initialized 00:06:23.409 EAL: Using IOMMU type 1 (Type 1) 00:06:29.974 Starting DPDK initialization... 00:06:29.974 Starting SPDK post initialization... 00:06:29.974 SPDK NVMe probe 00:06:29.974 Attaching to 0000:5e:00.0 00:06:29.974 Attached to 0000:5e:00.0 00:06:29.974 Cleaning up... 
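[Editor's note] The post-init probe attaches to 0000:5e:00.0 through vfio-pci, which setup.sh bound earlier in the log. A quick way to confirm which driver currently owns that BDF, independent of the test (illustrative sketch):

    dev=0000:5e:00.0
    basename "$(readlink /sys/bus/pci/devices/$dev/driver)"   # expect vfio-pci here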
00:06:29.974
00:06:29.974 real 0m6.569s
00:06:29.974 user 0m4.696s
00:06:29.974 sys 0m1.024s
00:06:29.974 12:36:59 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:29.974 12:36:59 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x
00:06:29.974 ************************************
00:06:29.974 END TEST env_dpdk_post_init
00:06:29.974 ************************************
00:06:29.974 12:36:59 env -- env/env.sh@26 -- # uname
00:06:29.974 12:36:59 env -- env/env.sh@26 -- # '[' Linux = Linux ']'
00:06:29.974 12:36:59 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:06:29.974 12:36:59 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:29.974 12:36:59 env -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:29.974 12:36:59 env -- common/autotest_common.sh@10 -- # set +x
00:06:29.974 ************************************
00:06:29.974 START TEST env_mem_callbacks
00:06:29.974 ************************************
00:06:29.974 12:36:59 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks
00:06:29.974 EAL: Detected CPU lcores: 72
00:06:29.974 EAL: Detected NUMA nodes: 2
00:06:29.974 EAL: Detected static linkage of DPDK
00:06:29.974 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:06:29.974 EAL: Selected IOVA mode 'VA'
00:06:29.974 EAL: VFIO support initialized
00:06:29.974
00:06:29.974
00:06:29.974 CUnit - A unit testing framework for C - Version 2.1-3
00:06:29.974 http://cunit.sourceforge.net/
00:06:29.974
00:06:29.975
00:06:29.975 Suite: memory
00:06:29.975 Test: test ...
00:06:29.975 register 0x200000200000 2097152
00:06:29.975 malloc 3145728
00:06:29.975 register 0x200000400000 4194304
00:06:29.975 buf 0x200000500000 len 3145728 PASSED
00:06:29.975 malloc 64
00:06:29.975 buf 0x2000004fff40 len 64 PASSED
00:06:29.975 malloc 4194304
00:06:29.975 register 0x200000800000 6291456
00:06:29.975 buf 0x200000a00000 len 4194304 PASSED
00:06:29.975 free 0x200000500000 3145728
00:06:29.975 free 0x2000004fff40 64
00:06:29.975 unregister 0x200000400000 4194304 PASSED
00:06:29.975 free 0x200000a00000 4194304
00:06:29.975 unregister 0x200000800000 6291456 PASSED
00:06:29.975 malloc 8388608
00:06:29.975 register 0x200000400000 10485760
00:06:29.975 buf 0x200000600000 len 8388608 PASSED
00:06:29.975 free 0x200000600000 8388608
00:06:29.975 unregister 0x200000400000 10485760 PASSED
00:06:29.975 passed
00:06:29.975
00:06:29.975 Run Summary: Type Total Ran Passed Failed Inactive
00:06:29.975 suites 1 1 n/a 0 0
00:06:29.975 tests 1 1 1 0 0
00:06:29.975 asserts 15 15 15 0 n/a
00:06:29.975
00:06:29.975 Elapsed time = 0.006 seconds
00:06:29.975
00:06:29.975 real 0m0.168s
00:06:29.975 user 0m0.018s
00:06:29.975 sys 0m0.050s
00:06:29.975 12:37:00 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:29.975 12:37:00 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x
00:06:29.975 ************************************
00:06:29.975 END TEST env_mem_callbacks
00:06:29.975 ************************************
00:06:29.975
00:06:29.975 real 0m8.672s
00:06:29.975 user 0m5.668s
00:06:29.975 sys 0m1.964s
00:06:29.975 12:37:00 env -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:29.975 12:37:00 env -- common/autotest_common.sh@10 -- # set +x
00:06:29.975 ************************************
00:06:29.975 END TEST env
00:06:29.975 ************************************
00:06:30.234 12:37:00 -- spdk/autotest.sh@156 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:06:30.234 12:37:00 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:30.234 12:37:00 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:30.234 12:37:00 -- common/autotest_common.sh@10 -- # set +x
00:06:30.234 ************************************
00:06:30.234 START TEST rpc
00:06:30.234 ************************************
00:06:30.234 12:37:00 rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh
00:06:30.234 * Looking for test storage...
00:06:30.234 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:06:30.234 12:37:00 rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]]
00:06:30.234 12:37:00 rpc -- common/autotest_common.sh@1693 -- # lcov --version
00:06:30.234 12:37:00 rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}'
00:06:30.234 12:37:00 rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2
00:06:30.234 12:37:00 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2
00:06:30.234 12:37:00 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l
00:06:30.234 12:37:00 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l
00:06:30.234 12:37:00 rpc -- scripts/common.sh@336 -- # IFS=.-:
00:06:30.234 12:37:00 rpc -- scripts/common.sh@336 -- # read -ra ver1
00:06:30.234 12:37:00 rpc -- scripts/common.sh@337 -- # IFS=.-:
00:06:30.234 12:37:00 rpc -- scripts/common.sh@337 -- # read -ra ver2
00:06:30.234 12:37:00 rpc -- scripts/common.sh@338 -- # local 'op=<'
00:06:30.234 12:37:00 rpc -- scripts/common.sh@340 -- # ver1_l=2
00:06:30.234 12:37:00 rpc -- scripts/common.sh@341 -- # ver2_l=1
00:06:30.234 12:37:00 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v
00:06:30.234 12:37:00 rpc -- scripts/common.sh@344 -- # case "$op" in
00:06:30.234 12:37:00 rpc -- scripts/common.sh@345 -- # : 1
00:06:30.234 12:37:00 rpc -- scripts/common.sh@364 -- # (( v = 0 ))
00:06:30.234 12:37:00 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:06:30.234 12:37:00 rpc -- scripts/common.sh@365 -- # decimal 1
00:06:30.234 12:37:00 rpc -- scripts/common.sh@353 -- # local d=1
00:06:30.234 12:37:00 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]]
00:06:30.234 12:37:00 rpc -- scripts/common.sh@355 -- # echo 1
00:06:30.234 12:37:00 rpc -- scripts/common.sh@365 -- # ver1[v]=1
00:06:30.234 12:37:00 rpc -- scripts/common.sh@366 -- # decimal 2
00:06:30.234 12:37:00 rpc -- scripts/common.sh@353 -- # local d=2
00:06:30.234 12:37:00 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]]
00:06:30.234 12:37:00 rpc -- scripts/common.sh@355 -- # echo 2
00:06:30.234 12:37:00 rpc -- scripts/common.sh@366 -- # ver2[v]=2
00:06:30.234 12:37:00 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] ))
00:06:30.234 12:37:00 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] ))
00:06:30.234 12:37:00 rpc -- scripts/common.sh@368 -- # return 0
00:06:30.234 12:37:00 rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:06:30.234 12:37:00 rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS=
00:06:30.234 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:30.234 --rc genhtml_branch_coverage=1
00:06:30.234 --rc genhtml_function_coverage=1
00:06:30.234 --rc genhtml_legend=1
00:06:30.234 --rc geninfo_all_blocks=1
00:06:30.234 --rc geninfo_unexecuted_blocks=1
00:06:30.234 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:30.234 '
00:06:30.234 12:37:00 rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS='
00:06:30.234 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:30.234 --rc genhtml_branch_coverage=1
00:06:30.234 --rc genhtml_function_coverage=1
00:06:30.234 --rc genhtml_legend=1
00:06:30.234 --rc geninfo_all_blocks=1
00:06:30.234 --rc geninfo_unexecuted_blocks=1
00:06:30.234 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:30.234 '
00:06:30.494 12:37:00 rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov
00:06:30.494 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:30.494 --rc genhtml_branch_coverage=1
00:06:30.494 --rc genhtml_function_coverage=1
00:06:30.494 --rc genhtml_legend=1
00:06:30.494 --rc geninfo_all_blocks=1
00:06:30.494 --rc geninfo_unexecuted_blocks=1
00:06:30.494 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:30.494 '
00:06:30.494 12:37:00 rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov
00:06:30.494 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:30.494 --rc genhtml_branch_coverage=1
00:06:30.494 --rc genhtml_function_coverage=1
00:06:30.494 --rc genhtml_legend=1
00:06:30.494 --rc geninfo_all_blocks=1
00:06:30.494 --rc geninfo_unexecuted_blocks=1
00:06:30.494 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:30.494 '
00:06:30.494 12:37:00 rpc -- rpc/rpc.sh@65 -- # spdk_pid=594272
00:06:30.494 12:37:00 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev
00:06:30.494 12:37:00 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:06:30.494 12:37:00 rpc -- rpc/rpc.sh@67 -- # waitforlisten 594272
00:06:30.494 12:37:00 rpc -- common/autotest_common.sh@835 -- # '[' -z 594272 ']'
00:06:30.494 12:37:00 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:30.494 12:37:00 rpc -- common/autotest_common.sh@840 -- # local max_retries=100
00:06:30.494 12:37:00 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:30.494 12:37:00 rpc -- common/autotest_common.sh@844 -- # xtrace_disable
00:06:30.494 12:37:00 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:30.494 [2024-11-28 12:37:00.385086] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization...
00:06:30.494 [2024-11-28 12:37:00.385153] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid594272 ]
00:06:30.494 [2024-11-28 12:37:00.521352] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation.
00:06:30.494 [2024-11-28 12:37:00.557027] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:30.494 [2024-11-28 12:37:00.580492] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified.
00:06:30.494 [2024-11-28 12:37:00.580538] app.c: 616:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 594272' to capture a snapshot of events at runtime.
00:06:30.494 [2024-11-28 12:37:00.580549] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only
00:06:30.494 [2024-11-28 12:37:00.580574] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running.
00:06:30.494 [2024-11-28 12:37:00.580581] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid594272 for offline analysis/debug.
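The waitforlisten helper above blocks until the freshly launched spdk_tgt answers on its UNIX-domain RPC socket. A minimal sketch of the same readiness check done by hand, assuming an spdk_tgt running with the default socket path and that scripts/rpc.py is invoked from the SPDK source tree:

  # Poll the RPC server; spdk_get_version is a cheap query that succeeds
  # as soon as the target is accepting JSON-RPC connections.
  until ./scripts/rpc.py -s /var/tmp/spdk.sock spdk_get_version >/dev/null 2>&1; do
      sleep 0.5
  done
  echo 'spdk_tgt is listening on /var/tmp/spdk.sock'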
00:06:30.494 [2024-11-28 12:37:00.580992] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:06:31.432 12:37:01 rpc -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:06:31.432 12:37:01 rpc -- common/autotest_common.sh@868 -- # return 0
00:06:31.432 12:37:01 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:06:31.432 12:37:01 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:06:31.432 12:37:01 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd
00:06:31.432 12:37:01 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity
00:06:31.432 12:37:01 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:31.432 12:37:01 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:31.432 12:37:01 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:31.432 ************************************
00:06:31.432 START TEST rpc_integrity
00:06:31.432 ************************************
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]'
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']'
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[
00:06:31.432 {
00:06:31.432 "name": "Malloc0",
00:06:31.432 "aliases": [
00:06:31.432 "ba032457-f44b-44e3-8a6b-5a13941d2806"
00:06:31.432 ],
00:06:31.432 "product_name": "Malloc disk",
00:06:31.432 "block_size": 512,
00:06:31.432 "num_blocks": 16384,
00:06:31.432 "uuid": "ba032457-f44b-44e3-8a6b-5a13941d2806",
00:06:31.432 "assigned_rate_limits": {
00:06:31.432 "rw_ios_per_sec": 0,
00:06:31.432 "rw_mbytes_per_sec": 0,
00:06:31.432 "r_mbytes_per_sec": 0,
00:06:31.432 "w_mbytes_per_sec": 0
00:06:31.432 },
00:06:31.432 "claimed": false,
00:06:31.432 "zoned": false,
00:06:31.432 "supported_io_types": {
00:06:31.432 "read": true,
00:06:31.432 "write": true,
00:06:31.432 "unmap": true,
00:06:31.432 "flush": true,
00:06:31.432 "reset": true,
00:06:31.432 "nvme_admin": false,
00:06:31.432 "nvme_io": false,
00:06:31.432 "nvme_io_md": false,
00:06:31.432 "write_zeroes": true,
00:06:31.432 "zcopy": true,
00:06:31.432 "get_zone_info": false,
00:06:31.432 "zone_management": false,
00:06:31.432 "zone_append": false,
00:06:31.432 "compare": false,
00:06:31.432 "compare_and_write": false,
00:06:31.432 "abort": true,
00:06:31.432 "seek_hole": false,
00:06:31.432 "seek_data": false,
00:06:31.432 "copy": true,
00:06:31.432 "nvme_iov_md": false
00:06:31.432 },
00:06:31.432 "memory_domains": [
00:06:31.432 {
00:06:31.432 "dma_device_id": "system",
00:06:31.432 "dma_device_type": 1
00:06:31.432 },
00:06:31.432 {
00:06:31.432 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:31.432 "dma_device_type": 2
00:06:31.432 }
00:06:31.432 ],
00:06:31.432 "driver_specific": {}
00:06:31.432 }
00:06:31.432 ]'
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']'
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.432 [2024-11-28 12:37:01.425099] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc0
00:06:31.432 [2024-11-28 12:37:01.425133] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:06:31.432 [2024-11-28 12:37:01.425150] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x498f400
00:06:31.432 [2024-11-28 12:37:01.425159] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed
00:06:31.432 [2024-11-28 12:37:01.426105] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:06:31.432 [2024-11-28 12:37:01.426129] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0
00:06:31.432 Passthru0
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.432 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.432 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[
00:06:31.432 {
00:06:31.432 "name": "Malloc0",
00:06:31.433 "aliases": [
00:06:31.433 "ba032457-f44b-44e3-8a6b-5a13941d2806"
00:06:31.433 ],
00:06:31.433 "product_name": "Malloc disk",
00:06:31.433 "block_size": 512,
00:06:31.433 "num_blocks": 16384,
00:06:31.433 "uuid": "ba032457-f44b-44e3-8a6b-5a13941d2806",
00:06:31.433 "assigned_rate_limits": {
00:06:31.433 "rw_ios_per_sec": 0,
00:06:31.433 "rw_mbytes_per_sec": 0,
00:06:31.433 "r_mbytes_per_sec": 0,
00:06:31.433 "w_mbytes_per_sec": 0
00:06:31.433 },
00:06:31.433 "claimed": true,
00:06:31.433 "claim_type": "exclusive_write",
00:06:31.433 "zoned": false,
00:06:31.433 "supported_io_types": {
00:06:31.433 "read": true,
00:06:31.433 "write": true,
00:06:31.433 "unmap": true,
00:06:31.433 "flush": true,
00:06:31.433 "reset": true,
00:06:31.433 "nvme_admin": false,
00:06:31.433 "nvme_io": false,
00:06:31.433 "nvme_io_md": false,
00:06:31.433 "write_zeroes": true,
00:06:31.433 "zcopy": true,
00:06:31.433 "get_zone_info": false,
00:06:31.433 "zone_management": false,
00:06:31.433 "zone_append": false,
00:06:31.433 "compare": false,
00:06:31.433 "compare_and_write": false,
00:06:31.433 "abort": true,
00:06:31.433 "seek_hole": false,
00:06:31.433 "seek_data": false,
00:06:31.433 "copy": true,
00:06:31.433 "nvme_iov_md": false
00:06:31.433 },
00:06:31.433 "memory_domains": [
00:06:31.433 {
00:06:31.433 "dma_device_id": "system",
00:06:31.433 "dma_device_type": 1
00:06:31.433 },
00:06:31.433 {
00:06:31.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:31.433 "dma_device_type": 2
00:06:31.433 }
00:06:31.433 ],
00:06:31.433 "driver_specific": {}
00:06:31.433 },
00:06:31.433 {
00:06:31.433 "name": "Passthru0",
00:06:31.433 "aliases": [
00:06:31.433 "dcd11e4a-ed08-52ed-8f9f-e519089810b4"
00:06:31.433 ],
00:06:31.433 "product_name": "passthru",
00:06:31.433 "block_size": 512,
00:06:31.433 "num_blocks": 16384,
00:06:31.433 "uuid": "dcd11e4a-ed08-52ed-8f9f-e519089810b4",
00:06:31.433 "assigned_rate_limits": {
00:06:31.433 "rw_ios_per_sec": 0,
00:06:31.433 "rw_mbytes_per_sec": 0,
00:06:31.433 "r_mbytes_per_sec": 0,
00:06:31.433 "w_mbytes_per_sec": 0
00:06:31.433 },
00:06:31.433 "claimed": false,
00:06:31.433 "zoned": false,
00:06:31.433 "supported_io_types": {
00:06:31.433 "read": true,
00:06:31.433 "write": true,
00:06:31.433 "unmap": true,
00:06:31.433 "flush": true,
00:06:31.433 "reset": true,
00:06:31.433 "nvme_admin": false,
00:06:31.433 "nvme_io": false,
00:06:31.433 "nvme_io_md": false,
00:06:31.433 "write_zeroes": true,
00:06:31.433 "zcopy": true,
00:06:31.433 "get_zone_info": false,
00:06:31.433 "zone_management": false,
00:06:31.433 "zone_append": false,
00:06:31.433 "compare": false,
00:06:31.433 "compare_and_write": false,
00:06:31.433 "abort": true,
00:06:31.433 "seek_hole": false,
00:06:31.433 "seek_data": false,
00:06:31.433 "copy": true,
00:06:31.433 "nvme_iov_md": false
00:06:31.433 },
00:06:31.433 "memory_domains": [
00:06:31.433 {
00:06:31.433 "dma_device_id": "system",
00:06:31.433 "dma_device_type": 1
00:06:31.433 },
00:06:31.433 {
00:06:31.433 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:31.433 "dma_device_type": 2
00:06:31.433 }
00:06:31.433 ],
00:06:31.433 "driver_specific": {
00:06:31.433 "passthru": {
00:06:31.433 "name": "Passthru0",
00:06:31.433 "base_bdev_name": "Malloc0"
00:06:31.433 }
00:06:31.433 }
00:06:31.433 }
00:06:31.433 ]'
00:06:31.433 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length
00:06:31.433 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:06:31.433 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.433 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.433 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.433 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]'
00:06:31.433 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length
00:06:31.433 12:37:01 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:06:31.433
00:06:31.433 real 0m0.276s
00:06:31.433 user 0m0.160s
00:06:31.433 sys 0m0.053s
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:31.433 12:37:01 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:31.433 ************************************
00:06:31.433 END TEST rpc_integrity
00:06:31.433 ************************************
00:06:31.693 12:37:01 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins
00:06:31.693 12:37:01 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:31.693 12:37:01 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:31.693 12:37:01 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:31.693 ************************************
00:06:31.693 START TEST rpc_plugins
00:06:31.693 ************************************
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[
00:06:31.693 {
00:06:31.693 "name": "Malloc1",
00:06:31.693 "aliases": [
00:06:31.693 "ac56380e-a33d-4926-85c9-8d75ebd021dd"
00:06:31.693 ],
00:06:31.693 "product_name": "Malloc disk",
00:06:31.693 "block_size": 4096,
00:06:31.693 "num_blocks": 256,
00:06:31.693 "uuid": "ac56380e-a33d-4926-85c9-8d75ebd021dd",
00:06:31.693 "assigned_rate_limits": {
00:06:31.693 "rw_ios_per_sec": 0,
00:06:31.693 "rw_mbytes_per_sec": 0,
00:06:31.693 "r_mbytes_per_sec": 0,
00:06:31.693 "w_mbytes_per_sec": 0
00:06:31.693 },
00:06:31.693 "claimed": false,
00:06:31.693 "zoned": false,
00:06:31.693 "supported_io_types": {
00:06:31.693 "read": true,
00:06:31.693 "write": true,
00:06:31.693 "unmap": true,
00:06:31.693 "flush": true,
00:06:31.693 "reset": true,
00:06:31.693 "nvme_admin": false,
00:06:31.693 "nvme_io": false,
00:06:31.693 "nvme_io_md": false,
00:06:31.693 "write_zeroes": true,
00:06:31.693 "zcopy": true,
00:06:31.693 "get_zone_info": false,
00:06:31.693 "zone_management": false,
00:06:31.693 "zone_append": false,
00:06:31.693 "compare": false,
00:06:31.693 "compare_and_write": false,
00:06:31.693 "abort": true,
00:06:31.693 "seek_hole": false,
00:06:31.693 "seek_data": false,
00:06:31.693 "copy": true,
00:06:31.693 "nvme_iov_md": false
00:06:31.693 },
00:06:31.693 "memory_domains": [
00:06:31.693 {
00:06:31.693 "dma_device_id": "system",
00:06:31.693 "dma_device_type": 1
00:06:31.693 },
00:06:31.693 {
00:06:31.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:31.693 "dma_device_type": 2
00:06:31.693 }
00:06:31.693 ],
00:06:31.693 "driver_specific": {}
00:06:31.693 }
00:06:31.693 ]'
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']'
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]'
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length
00:06:31.693 12:37:01 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']'
00:06:31.693
00:06:31.693 real 0m0.132s
00:06:31.693 user 0m0.080s
00:06:31.693 sys 0m0.020s
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:31.693 12:37:01 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x
00:06:31.693 ************************************
00:06:31.693 END TEST rpc_plugins
00:06:31.693 ************************************
00:06:31.693 12:37:01 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test
00:06:31.693 12:37:01 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:31.693 12:37:01 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:31.693 12:37:01 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:31.952 ************************************
00:06:31.952 START TEST rpc_trace_cmd_test
00:06:31.952 ************************************
00:06:31.952 12:37:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test
00:06:31.952 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info
00:06:31.952 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info
00:06:31.952 12:37:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:31.952 12:37:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x
00:06:31.952 12:37:01 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:31.952 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{
00:06:31.952 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid594272",
00:06:31.952 "tpoint_group_mask": "0x8",
00:06:31.952 "iscsi_conn": {
00:06:31.952 "mask": "0x2",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "scsi": {
00:06:31.952 "mask": "0x4",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "bdev": {
00:06:31.952 "mask": "0x8",
00:06:31.952 "tpoint_mask": "0xffffffffffffffff"
00:06:31.952 },
00:06:31.952 "nvmf_rdma": {
00:06:31.952 "mask": "0x10",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "nvmf_tcp": {
00:06:31.952 "mask": "0x20",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "ftl": {
00:06:31.952 "mask": "0x40",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "blobfs": {
00:06:31.952 "mask": "0x80",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "dsa": {
00:06:31.952 "mask": "0x200",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "thread": {
00:06:31.952 "mask": "0x400",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "nvme_pcie": {
00:06:31.952 "mask": "0x800",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "iaa": {
00:06:31.952 "mask": "0x1000",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "nvme_tcp": {
00:06:31.952 "mask": "0x2000",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.952 },
00:06:31.952 "bdev_nvme": {
00:06:31.952 "mask": "0x4000",
00:06:31.952 "tpoint_mask": "0x0"
00:06:31.953 },
00:06:31.953 "sock": {
00:06:31.953 "mask": "0x8000",
00:06:31.953 "tpoint_mask": "0x0"
00:06:31.953 },
00:06:31.953 "blob": {
00:06:31.953 "mask": "0x10000",
00:06:31.953 "tpoint_mask": "0x0"
00:06:31.953 },
00:06:31.953 "bdev_raid": {
00:06:31.953 "mask": "0x20000",
00:06:31.953 "tpoint_mask": "0x0"
00:06:31.953 },
00:06:31.953 "scheduler": {
00:06:31.953 "mask": "0x40000",
00:06:31.953 "tpoint_mask": "0x0"
00:06:31.953 }
00:06:31.953 }'
00:06:31.953 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length
00:06:31.953 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']'
00:06:31.953 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")'
00:06:31.953 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']'
00:06:31.953 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")'
00:06:31.953 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']'
00:06:31.953 12:37:01 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")'
00:06:31.953 12:37:02 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']'
00:06:31.953 12:37:02 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask
00:06:31.953 12:37:02 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']'
00:06:31.953
00:06:31.953 real 0m0.215s
00:06:31.953 user 0m0.180s
00:06:31.953 sys 0m0.028s
00:06:31.953 12:37:02 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:31.953 12:37:02 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x
00:06:31.953 ************************************
00:06:31.953 END TEST rpc_trace_cmd_test
00:06:31.953 ************************************
00:06:32.212 12:37:02 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]]
00:06:32.212 12:37:02 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd
00:06:32.212 12:37:02 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity
00:06:32.212 12:37:02 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:32.212 12:37:02 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:32.212 12:37:02 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:32.212 ************************************
00:06:32.212 START TEST rpc_daemon_integrity
00:06:32.212 ************************************
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]'
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']'
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[
00:06:32.212 {
00:06:32.212 "name": "Malloc2",
00:06:32.212 "aliases": [
00:06:32.212 "04193263-427f-4e91-ae6a-904a968b9251"
00:06:32.212 ],
00:06:32.212 "product_name": "Malloc disk",
00:06:32.212 "block_size": 512,
00:06:32.212 "num_blocks": 16384,
00:06:32.212 "uuid": "04193263-427f-4e91-ae6a-904a968b9251",
00:06:32.212 "assigned_rate_limits": {
00:06:32.212 "rw_ios_per_sec": 0,
00:06:32.212 "rw_mbytes_per_sec": 0,
00:06:32.212 "r_mbytes_per_sec": 0,
00:06:32.212 "w_mbytes_per_sec": 0
00:06:32.212 },
00:06:32.212 "claimed": false,
00:06:32.212 "zoned": false,
00:06:32.212 "supported_io_types": {
00:06:32.212 "read": true,
00:06:32.212 "write": true,
00:06:32.212 "unmap": true,
00:06:32.212 "flush": true,
00:06:32.212 "reset": true,
00:06:32.212 "nvme_admin": false,
00:06:32.212 "nvme_io": false,
00:06:32.212 "nvme_io_md": false,
00:06:32.212 "write_zeroes": true,
00:06:32.212 "zcopy": true,
00:06:32.212 "get_zone_info": false,
00:06:32.212 "zone_management": false,
00:06:32.212 "zone_append": false,
00:06:32.212 "compare": false,
00:06:32.212 "compare_and_write": false,
00:06:32.212 "abort": true,
00:06:32.212 "seek_hole": false,
00:06:32.212 "seek_data": false,
00:06:32.212 "copy": true,
00:06:32.212 "nvme_iov_md": false
00:06:32.212 },
00:06:32.212 "memory_domains": [
00:06:32.212 {
00:06:32.212 "dma_device_id": "system",
00:06:32.212 "dma_device_type": 1
00:06:32.212 },
00:06:32.212 {
00:06:32.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:32.212 "dma_device_type": 2
00:06:32.212 }
00:06:32.212 ],
00:06:32.212 "driver_specific": {}
00:06:32.212 }
00:06:32.212 ]'
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']'
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:32.212 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.212 [2024-11-28 12:37:02.253290] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc2
[2024-11-28 12:37:02.253323] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened
00:06:32.213 [2024-11-28 12:37:02.253347] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x4983440
00:06:32.213 [2024-11-28 12:37:02.253358] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed
00:06:32.213 [2024-11-28 12:37:02.254212] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:06:32.213 [2024-11-28 12:37:02.254235] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0
00:06:32.213 Passthru0
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[
00:06:32.213 {
00:06:32.213 "name": "Malloc2",
00:06:32.213 "aliases": [
00:06:32.213 "04193263-427f-4e91-ae6a-904a968b9251"
00:06:32.213 ],
00:06:32.213 "product_name": "Malloc disk",
00:06:32.213 "block_size": 512,
00:06:32.213 "num_blocks": 16384,
00:06:32.213 "uuid": "04193263-427f-4e91-ae6a-904a968b9251",
00:06:32.213 "assigned_rate_limits": {
00:06:32.213 "rw_ios_per_sec": 0,
00:06:32.213 "rw_mbytes_per_sec": 0,
00:06:32.213 "r_mbytes_per_sec": 0,
00:06:32.213 "w_mbytes_per_sec": 0
00:06:32.213 },
00:06:32.213 "claimed": true,
00:06:32.213 "claim_type": "exclusive_write",
00:06:32.213 "zoned": false,
00:06:32.213 "supported_io_types": {
00:06:32.213 "read": true,
00:06:32.213 "write": true,
00:06:32.213 "unmap": true,
00:06:32.213 "flush": true,
00:06:32.213 "reset": true,
00:06:32.213 "nvme_admin": false,
00:06:32.213 "nvme_io": false,
00:06:32.213 "nvme_io_md": false,
00:06:32.213 "write_zeroes": true,
00:06:32.213 "zcopy": true,
00:06:32.213 "get_zone_info": false,
00:06:32.213 "zone_management": false,
00:06:32.213 "zone_append": false,
00:06:32.213 "compare": false,
00:06:32.213 "compare_and_write": false,
00:06:32.213 "abort": true,
00:06:32.213 "seek_hole": false,
00:06:32.213 "seek_data": false,
00:06:32.213 "copy": true,
00:06:32.213 "nvme_iov_md": false
00:06:32.213 },
00:06:32.213 "memory_domains": [
00:06:32.213 {
00:06:32.213 "dma_device_id": "system",
00:06:32.213 "dma_device_type": 1
00:06:32.213 },
00:06:32.213 {
00:06:32.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:32.213 "dma_device_type": 2
00:06:32.213 }
00:06:32.213 ],
00:06:32.213 "driver_specific": {}
00:06:32.213 },
00:06:32.213 {
00:06:32.213 "name": "Passthru0",
00:06:32.213 "aliases": [
00:06:32.213 "c74543db-0653-51bb-a2fe-01ee73150ab6"
00:06:32.213 ],
00:06:32.213 "product_name": "passthru",
00:06:32.213 "block_size": 512,
00:06:32.213 "num_blocks": 16384,
00:06:32.213 "uuid": "c74543db-0653-51bb-a2fe-01ee73150ab6",
00:06:32.213 "assigned_rate_limits": {
00:06:32.213 "rw_ios_per_sec": 0,
00:06:32.213 "rw_mbytes_per_sec": 0,
00:06:32.213 "r_mbytes_per_sec": 0,
00:06:32.213 "w_mbytes_per_sec": 0
00:06:32.213 },
00:06:32.213 "claimed": false,
00:06:32.213 "zoned": false,
00:06:32.213 "supported_io_types": {
00:06:32.213 "read": true,
00:06:32.213 "write": true,
00:06:32.213 "unmap": true,
00:06:32.213 "flush": true,
00:06:32.213 "reset": true,
00:06:32.213 "nvme_admin": false,
00:06:32.213 "nvme_io": false,
00:06:32.213 "nvme_io_md": false,
00:06:32.213 "write_zeroes": true,
00:06:32.213 "zcopy": true,
00:06:32.213 "get_zone_info": false,
00:06:32.213 "zone_management": false,
00:06:32.213 "zone_append": false,
00:06:32.213 "compare": false,
00:06:32.213 "compare_and_write": false,
00:06:32.213 "abort": true,
00:06:32.213 "seek_hole": false,
00:06:32.213 "seek_data": false,
00:06:32.213 "copy": true,
00:06:32.213 "nvme_iov_md": false
00:06:32.213 },
00:06:32.213 "memory_domains": [
00:06:32.213 {
00:06:32.213 "dma_device_id": "system",
00:06:32.213 "dma_device_type": 1
00:06:32.213 },
00:06:32.213 {
00:06:32.213 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:06:32.213 "dma_device_type": 2
00:06:32.213 }
00:06:32.213 ],
00:06:32.213 "driver_specific": {
00:06:32.213 "passthru": {
00:06:32.213 "name": "Passthru0",
00:06:32.213 "base_bdev_name": "Malloc2"
00:06:32.213 }
00:06:32.213 }
00:06:32.213 }
00:06:32.213 ]'
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']'
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:32.213 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]'
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']'
00:06:32.472
00:06:32.472 real 0m0.252s
00:06:32.472 user 0m0.144s
00:06:32.472 sys 0m0.051s
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:32.472 12:37:02 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x
00:06:32.472 ************************************
00:06:32.472 END TEST rpc_daemon_integrity
00:06:32.472 ************************************
00:06:32.472 12:37:02 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT
00:06:32.472 12:37:02 rpc -- rpc/rpc.sh@84 -- # killprocess 594272
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@954 -- # '[' -z 594272 ']'
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@958 -- # kill -0 594272
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@959 -- # uname
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 594272
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 594272'
killing process with pid 594272
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@973 -- # kill 594272
00:06:32.472 12:37:02 rpc -- common/autotest_common.sh@978 -- # wait 594272
00:06:32.732
00:06:32.732 real 0m2.615s
00:06:32.732 user 0m3.159s
00:06:32.732 sys 0m0.837s
00:06:32.732 12:37:02 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:32.732 12:37:02 rpc -- common/autotest_common.sh@10 -- # set +x
00:06:32.732 ************************************
00:06:32.732 END TEST rpc
00:06:32.732 ************************************
00:06:32.992 12:37:02 -- spdk/autotest.sh@157 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh
00:06:32.992 12:37:02 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:32.992 12:37:02 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:32.992 12:37:02 -- common/autotest_common.sh@10 -- # set +x
00:06:32.992 ************************************
00:06:32.992 START TEST skip_rpc
00:06:32.992 ************************************
00:06:32.992 12:37:02 skip_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh
00:06:32.992 * Looking for test storage...
00:06:32.992 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc
00:06:32.992 12:37:02 skip_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]]
00:06:32.992 12:37:02 skip_rpc -- common/autotest_common.sh@1693 -- # lcov --version
00:06:32.992 12:37:02 skip_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}'
00:06:32.992 12:37:03 skip_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@336 -- # IFS=.-:
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@337 -- # IFS=.-:
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@338 -- # local 'op=<'
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@344 -- # case "$op" in
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@345 -- # : 1
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 ))
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@365 -- # decimal 1
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@353 -- # local d=1
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]]
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@355 -- # echo 1
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@366 -- # decimal 2
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@353 -- # local d=2
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]]
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@355 -- # echo 2
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] ))
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] ))
00:06:32.992 12:37:03 skip_rpc -- scripts/common.sh@368 -- # return 0
00:06:32.992 12:37:03 skip_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:06:32.992 12:37:03 skip_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS=
00:06:32.992 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:32.992 --rc genhtml_branch_coverage=1
00:06:32.992 --rc genhtml_function_coverage=1
00:06:32.992 --rc genhtml_legend=1
00:06:32.992 --rc geninfo_all_blocks=1
00:06:32.992 --rc geninfo_unexecuted_blocks=1
00:06:32.993 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:32.993 '
00:06:32.993 12:37:03 skip_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS='
00:06:32.993 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:32.993 --rc genhtml_branch_coverage=1
00:06:32.993 --rc genhtml_function_coverage=1
00:06:32.993 --rc genhtml_legend=1
00:06:32.993 --rc geninfo_all_blocks=1
00:06:32.993 --rc geninfo_unexecuted_blocks=1
00:06:32.993 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:32.993 '
00:06:32.993 12:37:03 skip_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov
00:06:32.993 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:32.993 --rc genhtml_branch_coverage=1
00:06:32.993 --rc genhtml_function_coverage=1
00:06:32.993 --rc genhtml_legend=1
00:06:32.993 --rc geninfo_all_blocks=1
00:06:32.993 --rc geninfo_unexecuted_blocks=1
00:06:32.993 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:32.993 '
00:06:32.993 12:37:03 skip_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov
00:06:32.993 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:06:32.993 --rc genhtml_branch_coverage=1
00:06:32.993 --rc genhtml_function_coverage=1
00:06:32.993 --rc genhtml_legend=1
00:06:32.993 --rc geninfo_all_blocks=1
00:06:32.993 --rc geninfo_unexecuted_blocks=1
00:06:32.993 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:06:32.993 '
00:06:32.993 12:37:03 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json
00:06:32.993 12:37:03 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt
00:06:32.993 12:37:03 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc
00:06:32.993 12:37:03 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:32.993 12:37:03 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:32.993 12:37:03 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:32.993 ************************************
00:06:32.993 START TEST skip_rpc
00:06:32.993 ************************************
00:06:32.993 12:37:03 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc
00:06:32.993 12:37:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1
00:06:32.993 12:37:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=594805
00:06:32.993 12:37:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:06:32.993 12:37:03 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5
00:06:33.252 [2024-11-28 12:37:03.104989] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization...
00:06:33.252 [2024-11-28 12:37:03.105046] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid594805 ]
00:06:33.252 [2024-11-28 12:37:03.239189] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation.
00:06:33.252 [2024-11-28 12:37:03.273128] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:33.252 [2024-11-28 12:37:03.299254] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]]
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 ))
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]]
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 ))
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 594805
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 594805 ']'
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 594805
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 594805
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 594805'
killing process with pid 594805
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 594805
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 594805
00:06:38.527
00:06:38.527 real 0m5.365s
00:06:38.527 user 0m5.008s
00:06:38.527 sys 0m0.288s
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable
00:06:38.527 12:37:08 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:38.527 ************************************
00:06:38.527 END TEST skip_rpc
00:06:38.527 ************************************
00:06:38.527 12:37:08 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json
00:06:38.527 12:37:08 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:06:38.527 12:37:08 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable
00:06:38.527 12:37:08 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:06:38.527 ************************************
00:06:38.527 START TEST skip_rpc_with_json
00:06:38.527 ************************************
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=595548
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 595548
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 595548 ']'
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable
00:06:38.527 12:37:08 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x
00:06:38.527 [2024-11-28 12:37:08.561578] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization...
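The NOT rpc_cmd spdk_get_version step in the skip_rpc run above is the inverse check: started with --no-rpc-server, the target must refuse every RPC. Sketched by hand, under the same binary and socket-path assumptions as the harness:

  ./build/bin/spdk_tgt --no-rpc-server -m 0x1 &
  tgt=$!
  sleep 5
  # With no RPC server listening, this call must fail for the test to pass.
  if ./scripts/rpc.py spdk_get_version >/dev/null 2>&1; then
      echo 'unexpected: target answered an RPC' >&2
  fi
  kill $tgt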
00:06:38.527 [2024-11-28 12:37:08.561647] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid595548 ] 00:06:38.787 [2024-11-28 12:37:08.698663] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:38.787 [2024-11-28 12:37:08.733390] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.787 [2024-11-28 12:37:08.757607] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:39.356 [2024-11-28 12:37:09.423369] nvmf_rpc.c:2706:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:39.356 request: 00:06:39.356 { 00:06:39.356 "trtype": "tcp", 00:06:39.356 "method": "nvmf_get_transports", 00:06:39.356 "req_id": 1 00:06:39.356 } 00:06:39.356 Got JSON-RPC error response 00:06:39.356 response: 00:06:39.356 { 00:06:39.356 "code": -19, 00:06:39.356 "message": "No such device" 00:06:39.356 } 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:39.356 [2024-11-28 12:37:09.435439] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:39.356 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:39.616 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:39.616 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:39.616 { 00:06:39.616 "subsystems": [ 00:06:39.616 { 00:06:39.616 "subsystem": "scheduler", 00:06:39.616 "config": [ 00:06:39.616 { 00:06:39.616 "method": "framework_set_scheduler", 00:06:39.616 "params": { 00:06:39.616 "name": "static" 00:06:39.616 } 00:06:39.616 } 00:06:39.616 ] 00:06:39.616 }, 00:06:39.616 { 00:06:39.616 "subsystem": "vmd", 00:06:39.616 "config": [] 00:06:39.616 }, 00:06:39.616 { 00:06:39.616 "subsystem": "sock", 00:06:39.617 "config": [ 00:06:39.617 { 00:06:39.617 "method": "sock_set_default_impl", 00:06:39.617 "params": { 00:06:39.617 "impl_name": "posix" 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "sock_impl_set_options", 00:06:39.617 "params": { 00:06:39.617 "impl_name": 
"ssl", 00:06:39.617 "recv_buf_size": 4096, 00:06:39.617 "send_buf_size": 4096, 00:06:39.617 "enable_recv_pipe": true, 00:06:39.617 "enable_quickack": false, 00:06:39.617 "enable_placement_id": 0, 00:06:39.617 "enable_zerocopy_send_server": true, 00:06:39.617 "enable_zerocopy_send_client": false, 00:06:39.617 "zerocopy_threshold": 0, 00:06:39.617 "tls_version": 0, 00:06:39.617 "enable_ktls": false 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "sock_impl_set_options", 00:06:39.617 "params": { 00:06:39.617 "impl_name": "posix", 00:06:39.617 "recv_buf_size": 2097152, 00:06:39.617 "send_buf_size": 2097152, 00:06:39.617 "enable_recv_pipe": true, 00:06:39.617 "enable_quickack": false, 00:06:39.617 "enable_placement_id": 0, 00:06:39.617 "enable_zerocopy_send_server": true, 00:06:39.617 "enable_zerocopy_send_client": false, 00:06:39.617 "zerocopy_threshold": 0, 00:06:39.617 "tls_version": 0, 00:06:39.617 "enable_ktls": false 00:06:39.617 } 00:06:39.617 } 00:06:39.617 ] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "iobuf", 00:06:39.617 "config": [ 00:06:39.617 { 00:06:39.617 "method": "iobuf_set_options", 00:06:39.617 "params": { 00:06:39.617 "small_pool_count": 8192, 00:06:39.617 "large_pool_count": 1024, 00:06:39.617 "small_bufsize": 8192, 00:06:39.617 "large_bufsize": 135168, 00:06:39.617 "enable_numa": false 00:06:39.617 } 00:06:39.617 } 00:06:39.617 ] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "keyring", 00:06:39.617 "config": [] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "vfio_user_target", 00:06:39.617 "config": null 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "fsdev", 00:06:39.617 "config": [ 00:06:39.617 { 00:06:39.617 "method": "fsdev_set_opts", 00:06:39.617 "params": { 00:06:39.617 "fsdev_io_pool_size": 65535, 00:06:39.617 "fsdev_io_cache_size": 256 00:06:39.617 } 00:06:39.617 } 00:06:39.617 ] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "accel", 00:06:39.617 "config": [ 00:06:39.617 { 00:06:39.617 "method": "accel_set_options", 00:06:39.617 "params": { 00:06:39.617 "small_cache_size": 128, 00:06:39.617 "large_cache_size": 16, 00:06:39.617 "task_count": 2048, 00:06:39.617 "sequence_count": 2048, 00:06:39.617 "buf_count": 2048 00:06:39.617 } 00:06:39.617 } 00:06:39.617 ] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "bdev", 00:06:39.617 "config": [ 00:06:39.617 { 00:06:39.617 "method": "bdev_set_options", 00:06:39.617 "params": { 00:06:39.617 "bdev_io_pool_size": 65535, 00:06:39.617 "bdev_io_cache_size": 256, 00:06:39.617 "bdev_auto_examine": true, 00:06:39.617 "iobuf_small_cache_size": 128, 00:06:39.617 "iobuf_large_cache_size": 16 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "bdev_raid_set_options", 00:06:39.617 "params": { 00:06:39.617 "process_window_size_kb": 1024, 00:06:39.617 "process_max_bandwidth_mb_sec": 0 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "bdev_nvme_set_options", 00:06:39.617 "params": { 00:06:39.617 "action_on_timeout": "none", 00:06:39.617 "timeout_us": 0, 00:06:39.617 "timeout_admin_us": 0, 00:06:39.617 "keep_alive_timeout_ms": 10000, 00:06:39.617 "arbitration_burst": 0, 00:06:39.617 "low_priority_weight": 0, 00:06:39.617 "medium_priority_weight": 0, 00:06:39.617 "high_priority_weight": 0, 00:06:39.617 "nvme_adminq_poll_period_us": 10000, 00:06:39.617 "nvme_ioq_poll_period_us": 0, 00:06:39.617 "io_queue_requests": 0, 00:06:39.617 "delay_cmd_submit": true, 00:06:39.617 "transport_retry_count": 4, 00:06:39.617 
"bdev_retry_count": 3, 00:06:39.617 "transport_ack_timeout": 0, 00:06:39.617 "ctrlr_loss_timeout_sec": 0, 00:06:39.617 "reconnect_delay_sec": 0, 00:06:39.617 "fast_io_fail_timeout_sec": 0, 00:06:39.617 "disable_auto_failback": false, 00:06:39.617 "generate_uuids": false, 00:06:39.617 "transport_tos": 0, 00:06:39.617 "nvme_error_stat": false, 00:06:39.617 "rdma_srq_size": 0, 00:06:39.617 "io_path_stat": false, 00:06:39.617 "allow_accel_sequence": false, 00:06:39.617 "rdma_max_cq_size": 0, 00:06:39.617 "rdma_cm_event_timeout_ms": 0, 00:06:39.617 "dhchap_digests": [ 00:06:39.617 "sha256", 00:06:39.617 "sha384", 00:06:39.617 "sha512" 00:06:39.617 ], 00:06:39.617 "dhchap_dhgroups": [ 00:06:39.617 "null", 00:06:39.617 "ffdhe2048", 00:06:39.617 "ffdhe3072", 00:06:39.617 "ffdhe4096", 00:06:39.617 "ffdhe6144", 00:06:39.617 "ffdhe8192" 00:06:39.617 ] 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "bdev_nvme_set_hotplug", 00:06:39.617 "params": { 00:06:39.617 "period_us": 100000, 00:06:39.617 "enable": false 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "bdev_iscsi_set_options", 00:06:39.617 "params": { 00:06:39.617 "timeout_sec": 30 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "bdev_wait_for_examine" 00:06:39.617 } 00:06:39.617 ] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "nvmf", 00:06:39.617 "config": [ 00:06:39.617 { 00:06:39.617 "method": "nvmf_set_config", 00:06:39.617 "params": { 00:06:39.617 "discovery_filter": "match_any", 00:06:39.617 "admin_cmd_passthru": { 00:06:39.617 "identify_ctrlr": false 00:06:39.617 }, 00:06:39.617 "dhchap_digests": [ 00:06:39.617 "sha256", 00:06:39.617 "sha384", 00:06:39.617 "sha512" 00:06:39.617 ], 00:06:39.617 "dhchap_dhgroups": [ 00:06:39.617 "null", 00:06:39.617 "ffdhe2048", 00:06:39.617 "ffdhe3072", 00:06:39.617 "ffdhe4096", 00:06:39.617 "ffdhe6144", 00:06:39.617 "ffdhe8192" 00:06:39.617 ] 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "nvmf_set_max_subsystems", 00:06:39.617 "params": { 00:06:39.617 "max_subsystems": 1024 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "nvmf_set_crdt", 00:06:39.617 "params": { 00:06:39.617 "crdt1": 0, 00:06:39.617 "crdt2": 0, 00:06:39.617 "crdt3": 0 00:06:39.617 } 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "method": "nvmf_create_transport", 00:06:39.617 "params": { 00:06:39.617 "trtype": "TCP", 00:06:39.617 "max_queue_depth": 128, 00:06:39.617 "max_io_qpairs_per_ctrlr": 127, 00:06:39.617 "in_capsule_data_size": 4096, 00:06:39.617 "max_io_size": 131072, 00:06:39.617 "io_unit_size": 131072, 00:06:39.617 "max_aq_depth": 128, 00:06:39.617 "num_shared_buffers": 511, 00:06:39.617 "buf_cache_size": 4294967295, 00:06:39.617 "dif_insert_or_strip": false, 00:06:39.617 "zcopy": false, 00:06:39.617 "c2h_success": true, 00:06:39.617 "sock_priority": 0, 00:06:39.617 "abort_timeout_sec": 1, 00:06:39.617 "ack_timeout": 0, 00:06:39.617 "data_wr_pool_size": 0 00:06:39.617 } 00:06:39.617 } 00:06:39.617 ] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "nbd", 00:06:39.617 "config": [] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "ublk", 00:06:39.617 "config": [] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "vhost_blk", 00:06:39.617 "config": [] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "scsi", 00:06:39.617 "config": null 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "iscsi", 00:06:39.617 "config": [ 00:06:39.617 { 00:06:39.617 "method": "iscsi_set_options", 
00:06:39.617 "params": { 00:06:39.617 "node_base": "iqn.2016-06.io.spdk", 00:06:39.617 "max_sessions": 128, 00:06:39.617 "max_connections_per_session": 2, 00:06:39.617 "max_queue_depth": 64, 00:06:39.617 "default_time2wait": 2, 00:06:39.617 "default_time2retain": 20, 00:06:39.617 "first_burst_length": 8192, 00:06:39.617 "immediate_data": true, 00:06:39.617 "allow_duplicated_isid": false, 00:06:39.617 "error_recovery_level": 0, 00:06:39.617 "nop_timeout": 60, 00:06:39.617 "nop_in_interval": 30, 00:06:39.617 "disable_chap": false, 00:06:39.617 "require_chap": false, 00:06:39.617 "mutual_chap": false, 00:06:39.617 "chap_group": 0, 00:06:39.617 "max_large_datain_per_connection": 64, 00:06:39.617 "max_r2t_per_connection": 4, 00:06:39.617 "pdu_pool_size": 36864, 00:06:39.617 "immediate_data_pool_size": 16384, 00:06:39.617 "data_out_pool_size": 2048 00:06:39.617 } 00:06:39.617 } 00:06:39.617 ] 00:06:39.617 }, 00:06:39.617 { 00:06:39.617 "subsystem": "vhost_scsi", 00:06:39.617 "config": [] 00:06:39.617 } 00:06:39.617 ] 00:06:39.617 } 00:06:39.617 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:39.617 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 595548 00:06:39.617 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 595548 ']' 00:06:39.617 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 595548 00:06:39.618 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:39.618 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:39.618 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 595548 00:06:39.618 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:39.618 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:39.618 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 595548' 00:06:39.618 killing process with pid 595548 00:06:39.618 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 595548 00:06:39.618 12:37:09 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 595548 00:06:39.877 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=595770 00:06:39.877 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:39.877 12:37:09 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:45.160 12:37:14 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 595770 00:06:45.160 12:37:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 595770 ']' 00:06:45.160 12:37:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 595770 00:06:45.160 12:37:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:06:45.160 12:37:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:45.160 12:37:14 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 595770 00:06:45.160 12:37:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 
00:06:45.160 12:37:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:45.160 12:37:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 595770' 00:06:45.160 killing process with pid 595770 00:06:45.160 12:37:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 595770 00:06:45.160 12:37:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 595770 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:45.425 00:06:45.425 real 0m6.788s 00:06:45.425 user 0m6.436s 00:06:45.425 sys 0m0.646s 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:45.425 ************************************ 00:06:45.425 END TEST skip_rpc_with_json 00:06:45.425 ************************************ 00:06:45.425 12:37:15 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:45.425 12:37:15 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:45.425 12:37:15 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:45.425 12:37:15 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:45.425 ************************************ 00:06:45.425 START TEST skip_rpc_with_delay 00:06:45.425 ************************************ 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 
00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:45.425 [2024-11-28 12:37:15.436226] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:45.425 00:06:45.425 real 0m0.045s 00:06:45.425 user 0m0.022s 00:06:45.425 sys 0m0.023s 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:45.425 12:37:15 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:06:45.425 ************************************ 00:06:45.425 END TEST skip_rpc_with_delay 00:06:45.425 ************************************ 00:06:45.425 12:37:15 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:06:45.425 12:37:15 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:45.425 12:37:15 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:45.425 12:37:15 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:45.425 12:37:15 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:45.425 12:37:15 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:45.425 ************************************ 00:06:45.425 START TEST exit_on_failed_rpc_init 00:06:45.425 ************************************ 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=596638 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 596638 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 596638 ']' 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:45.425 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:45.425 12:37:15 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:45.685 [2024-11-28 12:37:15.557659] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:06:45.685 [2024-11-28 12:37:15.557716] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid596638 ] 00:06:45.685 [2024-11-28 12:37:15.696999] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:45.685 [2024-11-28 12:37:15.727311] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.685 [2024-11-28 12:37:15.751465] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:46.623 [2024-11-28 12:37:16.449536] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:46.623 [2024-11-28 12:37:16.449608] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid596659 ] 00:06:46.623 [2024-11-28 12:37:16.587067] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:46.623 [2024-11-28 12:37:16.621015] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.623 [2024-11-28 12:37:16.644745] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:46.623 [2024-11-28 12:37:16.644823] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 00:06:46.623 [2024-11-28 12:37:16.644836] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:46.623 [2024-11-28 12:37:16.644844] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 596638 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 596638 ']' 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 596638 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 596638 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 596638' 00:06:46.623 killing process with pid 596638 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 596638 00:06:46.623 12:37:16 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 596638 00:06:47.191 00:06:47.191 real 0m1.498s 00:06:47.191 user 0m1.558s 00:06:47.191 sys 0m0.419s 00:06:47.191 12:37:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.191 12:37:17 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:06:47.191 ************************************ 00:06:47.191 END TEST exit_on_failed_rpc_init 00:06:47.191 ************************************ 00:06:47.191 12:37:17 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:47.191 00:06:47.191 real 0m14.212s 00:06:47.191 user 0m13.270s 00:06:47.191 sys 0m1.688s 00:06:47.191 12:37:17 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.191 12:37:17 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:47.191 ************************************ 00:06:47.191 END TEST skip_rpc 00:06:47.191 
************************************ 00:06:47.191 12:37:17 -- spdk/autotest.sh@158 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:47.191 12:37:17 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.191 12:37:17 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.191 12:37:17 -- common/autotest_common.sh@10 -- # set +x 00:06:47.191 ************************************ 00:06:47.191 START TEST rpc_client 00:06:47.191 ************************************ 00:06:47.191 12:37:17 rpc_client -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:47.191 * Looking for test storage... 00:06:47.191 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:47.191 12:37:17 rpc_client -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:47.191 12:37:17 rpc_client -- common/autotest_common.sh@1693 -- # lcov --version 00:06:47.191 12:37:17 rpc_client -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:47.449 12:37:17 rpc_client -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@345 -- # : 1 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@353 -- # local d=1 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@355 -- # echo 1 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@353 -- # local d=2 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@355 -- # echo 2 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:47.449 12:37:17 rpc_client -- scripts/common.sh@368 -- # return 0 00:06:47.449 12:37:17 rpc_client -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:47.449 12:37:17 rpc_client -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:47.449 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.449 --rc genhtml_branch_coverage=1 00:06:47.449 --rc genhtml_function_coverage=1 00:06:47.449 --rc genhtml_legend=1 00:06:47.449 --rc geninfo_all_blocks=1 00:06:47.449 --rc geninfo_unexecuted_blocks=1 00:06:47.449 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.449 ' 00:06:47.449 12:37:17 rpc_client -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:47.449 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.449 --rc genhtml_branch_coverage=1 00:06:47.449 --rc genhtml_function_coverage=1 00:06:47.449 --rc genhtml_legend=1 00:06:47.449 --rc geninfo_all_blocks=1 00:06:47.449 --rc geninfo_unexecuted_blocks=1 00:06:47.449 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.449 ' 00:06:47.449 12:37:17 rpc_client -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:47.449 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.449 --rc genhtml_branch_coverage=1 00:06:47.449 --rc genhtml_function_coverage=1 00:06:47.449 --rc genhtml_legend=1 00:06:47.449 --rc geninfo_all_blocks=1 00:06:47.449 --rc geninfo_unexecuted_blocks=1 00:06:47.449 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.449 ' 00:06:47.449 12:37:17 rpc_client -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:47.449 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.449 --rc genhtml_branch_coverage=1 00:06:47.449 --rc genhtml_function_coverage=1 00:06:47.449 --rc genhtml_legend=1 00:06:47.449 --rc geninfo_all_blocks=1 00:06:47.449 --rc geninfo_unexecuted_blocks=1 00:06:47.449 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.449 ' 00:06:47.449 12:37:17 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:47.449 OK 00:06:47.449 12:37:17 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:47.449 00:06:47.449 real 0m0.218s 00:06:47.449 user 0m0.118s 00:06:47.449 sys 0m0.115s 00:06:47.449 12:37:17 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable 
00:06:47.449 12:37:17 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:06:47.449 ************************************ 00:06:47.449 END TEST rpc_client 00:06:47.449 ************************************ 00:06:47.449 12:37:17 -- spdk/autotest.sh@159 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:47.450 12:37:17 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.450 12:37:17 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.450 12:37:17 -- common/autotest_common.sh@10 -- # set +x 00:06:47.450 ************************************ 00:06:47.450 START TEST json_config 00:06:47.450 ************************************ 00:06:47.450 12:37:17 json_config -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:47.450 12:37:17 json_config -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:47.450 12:37:17 json_config -- common/autotest_common.sh@1693 -- # lcov --version 00:06:47.450 12:37:17 json_config -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:47.708 12:37:17 json_config -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:47.708 12:37:17 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:47.708 12:37:17 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:47.708 12:37:17 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:47.709 12:37:17 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:06:47.709 12:37:17 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:06:47.709 12:37:17 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:06:47.709 12:37:17 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:06:47.709 12:37:17 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:06:47.709 12:37:17 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:06:47.709 12:37:17 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:06:47.709 12:37:17 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:47.709 12:37:17 json_config -- scripts/common.sh@344 -- # case "$op" in 00:06:47.709 12:37:17 json_config -- scripts/common.sh@345 -- # : 1 00:06:47.709 12:37:17 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:47.709 12:37:17 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:47.709 12:37:17 json_config -- scripts/common.sh@365 -- # decimal 1 00:06:47.709 12:37:17 json_config -- scripts/common.sh@353 -- # local d=1 00:06:47.709 12:37:17 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:47.709 12:37:17 json_config -- scripts/common.sh@355 -- # echo 1 00:06:47.709 12:37:17 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:06:47.709 12:37:17 json_config -- scripts/common.sh@366 -- # decimal 2 00:06:47.709 12:37:17 json_config -- scripts/common.sh@353 -- # local d=2 00:06:47.709 12:37:17 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:47.709 12:37:17 json_config -- scripts/common.sh@355 -- # echo 2 00:06:47.709 12:37:17 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:06:47.709 12:37:17 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:47.709 12:37:17 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:47.709 12:37:17 json_config -- scripts/common.sh@368 -- # return 0 00:06:47.709 12:37:17 json_config -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:47.709 12:37:17 json_config -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:47.709 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.709 --rc genhtml_branch_coverage=1 00:06:47.709 --rc genhtml_function_coverage=1 00:06:47.709 --rc genhtml_legend=1 00:06:47.709 --rc geninfo_all_blocks=1 00:06:47.709 --rc geninfo_unexecuted_blocks=1 00:06:47.709 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.709 ' 00:06:47.709 12:37:17 json_config -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:47.709 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.709 --rc genhtml_branch_coverage=1 00:06:47.709 --rc genhtml_function_coverage=1 00:06:47.709 --rc genhtml_legend=1 00:06:47.709 --rc geninfo_all_blocks=1 00:06:47.709 --rc geninfo_unexecuted_blocks=1 00:06:47.709 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.709 ' 00:06:47.709 12:37:17 json_config -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:47.709 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.709 --rc genhtml_branch_coverage=1 00:06:47.709 --rc genhtml_function_coverage=1 00:06:47.709 --rc genhtml_legend=1 00:06:47.709 --rc geninfo_all_blocks=1 00:06:47.709 --rc geninfo_unexecuted_blocks=1 00:06:47.709 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.709 ' 00:06:47.709 12:37:17 json_config -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:47.709 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.709 --rc genhtml_branch_coverage=1 00:06:47.709 --rc genhtml_function_coverage=1 00:06:47.709 --rc genhtml_legend=1 00:06:47.709 --rc geninfo_all_blocks=1 00:06:47.709 --rc geninfo_unexecuted_blocks=1 00:06:47.709 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.709 ' 00:06:47.709 12:37:17 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@7 -- # uname -s 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@10 
-- # NVMF_SECOND_PORT=4421 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809f3706-e051-e711-906e-0017a4403562 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=809f3706-e051-e711-906e-0017a4403562 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:47.709 12:37:17 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:06:47.709 12:37:17 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:47.709 12:37:17 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:47.709 12:37:17 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:47.709 12:37:17 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.709 12:37:17 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.709 12:37:17 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.709 12:37:17 json_config -- paths/export.sh@5 -- # export PATH 00:06:47.709 12:37:17 json_config -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@51 -- # : 0 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:47.709 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:47.709 12:37:17 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:47.709 12:37:17 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:47.709 12:37:17 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:47.709 12:37:17 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:47.709 12:37:17 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:47.709 12:37:17 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:47.709 12:37:17 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:47.709 WARNING: No tests are enabled so not running JSON configuration tests 00:06:47.709 12:37:17 json_config -- json_config/json_config.sh@28 -- # exit 0 00:06:47.709 00:06:47.709 real 0m0.186s 00:06:47.709 user 0m0.112s 00:06:47.709 sys 0m0.081s 00:06:47.709 12:37:17 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:47.709 12:37:17 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:47.709 ************************************ 00:06:47.709 END TEST json_config 00:06:47.709 ************************************ 00:06:47.709 12:37:17 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:47.709 12:37:17 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:47.709 12:37:17 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:47.709 12:37:17 -- common/autotest_common.sh@10 -- # set +x 00:06:47.709 ************************************ 00:06:47.709 START TEST json_config_extra_key 00:06:47.709 ************************************ 00:06:47.709 12:37:17 json_config_extra_key -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:47.709 12:37:17 json_config_extra_key -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:47.709 12:37:17 json_config_extra_key -- common/autotest_common.sh@1693 -- # lcov 
--version 00:06:47.709 12:37:17 json_config_extra_key -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:47.968 12:37:17 json_config_extra_key -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:47.968 12:37:17 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:06:47.968 12:37:17 json_config_extra_key -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:47.968 12:37:17 json_config_extra_key -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:47.968 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.968 --rc genhtml_branch_coverage=1 00:06:47.968 --rc genhtml_function_coverage=1 00:06:47.968 --rc genhtml_legend=1 00:06:47.968 --rc geninfo_all_blocks=1 00:06:47.968 --rc geninfo_unexecuted_blocks=1 00:06:47.968 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.968 ' 00:06:47.968 12:37:17 json_config_extra_key -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:47.968 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.968 --rc genhtml_branch_coverage=1 
00:06:47.968 --rc genhtml_function_coverage=1 00:06:47.968 --rc genhtml_legend=1 00:06:47.969 --rc geninfo_all_blocks=1 00:06:47.969 --rc geninfo_unexecuted_blocks=1 00:06:47.969 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.969 ' 00:06:47.969 12:37:17 json_config_extra_key -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:47.969 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.969 --rc genhtml_branch_coverage=1 00:06:47.969 --rc genhtml_function_coverage=1 00:06:47.969 --rc genhtml_legend=1 00:06:47.969 --rc geninfo_all_blocks=1 00:06:47.969 --rc geninfo_unexecuted_blocks=1 00:06:47.969 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.969 ' 00:06:47.969 12:37:17 json_config_extra_key -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:47.969 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:47.969 --rc genhtml_branch_coverage=1 00:06:47.969 --rc genhtml_function_coverage=1 00:06:47.969 --rc genhtml_legend=1 00:06:47.969 --rc geninfo_all_blocks=1 00:06:47.969 --rc geninfo_unexecuted_blocks=1 00:06:47.969 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:47.969 ' 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809f3706-e051-e711-906e-0017a4403562 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=809f3706-e051-e711-906e-0017a4403562 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:47.969 12:37:17 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:06:47.969 12:37:17 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:47.969 12:37:17 json_config_extra_key -- 
scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:47.969 12:37:17 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:47.969 12:37:17 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.969 12:37:17 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.969 12:37:17 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.969 12:37:17 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:47.969 12:37:17 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:06:47.969 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:06:47.969 12:37:17 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # 
declare -A app_pid 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:47.969 INFO: launching applications... 00:06:47.969 12:37:17 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=597017 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:47.969 Waiting for target to run... 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 597017 /var/tmp/spdk_tgt.sock 00:06:47.969 12:37:17 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 597017 ']' 00:06:47.969 12:37:17 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:47.969 12:37:17 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:47.969 12:37:17 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:47.969 12:37:17 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:47.969 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
00:06:47.969 12:37:17 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:47.969 12:37:17 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:47.969 [2024-11-28 12:37:17.934215] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:47.969 [2024-11-28 12:37:17.934298] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid597017 ] 00:06:48.536 [2024-11-28 12:37:18.444272] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:48.536 [2024-11-28 12:37:18.480683] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.536 [2024-11-28 12:37:18.499479] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.794 12:37:18 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:48.794 12:37:18 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:48.794 00:06:48.794 12:37:18 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:48.794 INFO: shutting down applications... 00:06:48.794 12:37:18 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 597017 ]] 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 597017 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 597017 00:06:48.794 12:37:18 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:49.362 12:37:19 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:49.362 12:37:19 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:49.362 12:37:19 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 597017 00:06:49.362 12:37:19 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:49.362 12:37:19 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:49.362 12:37:19 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:49.362 12:37:19 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:49.362 SPDK target shutdown done 00:06:49.362 12:37:19 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:49.362 Success 00:06:49.362 00:06:49.362 real 0m1.594s 00:06:49.362 user 0m1.054s 00:06:49.362 sys 0m0.615s 00:06:49.362 12:37:19 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:49.362 12:37:19 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:49.362 ************************************ 00:06:49.362 END TEST json_config_extra_key 00:06:49.362 
************************************ 00:06:49.362 12:37:19 -- spdk/autotest.sh@161 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:49.362 12:37:19 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:49.362 12:37:19 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:49.362 12:37:19 -- common/autotest_common.sh@10 -- # set +x 00:06:49.362 ************************************ 00:06:49.362 START TEST alias_rpc 00:06:49.362 ************************************ 00:06:49.362 12:37:19 alias_rpc -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:49.362 * Looking for test storage... 00:06:49.621 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:49.621 12:37:19 alias_rpc -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:49.621 12:37:19 alias_rpc -- common/autotest_common.sh@1693 -- # lcov --version 00:06:49.621 12:37:19 alias_rpc -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:49.621 12:37:19 alias_rpc -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@345 -- # : 1 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:49.621 12:37:19 alias_rpc -- scripts/common.sh@368 -- # return 0 00:06:49.621 12:37:19 alias_rpc -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:49.621 12:37:19 alias_rpc -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:49.621 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:49.621 --rc genhtml_branch_coverage=1 00:06:49.621 --rc genhtml_function_coverage=1 00:06:49.621 --rc genhtml_legend=1 00:06:49.621 --rc geninfo_all_blocks=1 00:06:49.621 --rc geninfo_unexecuted_blocks=1 00:06:49.621 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:49.621 ' 00:06:49.621 12:37:19 alias_rpc -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:49.621 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:49.621 --rc genhtml_branch_coverage=1 00:06:49.622 --rc genhtml_function_coverage=1 00:06:49.622 --rc genhtml_legend=1 00:06:49.622 --rc geninfo_all_blocks=1 00:06:49.622 --rc geninfo_unexecuted_blocks=1 00:06:49.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:49.622 ' 00:06:49.622 12:37:19 alias_rpc -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:49.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:49.622 --rc genhtml_branch_coverage=1 00:06:49.622 --rc genhtml_function_coverage=1 00:06:49.622 --rc genhtml_legend=1 00:06:49.622 --rc geninfo_all_blocks=1 00:06:49.622 --rc geninfo_unexecuted_blocks=1 00:06:49.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:49.622 ' 00:06:49.622 12:37:19 alias_rpc -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:49.622 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:49.622 --rc genhtml_branch_coverage=1 00:06:49.622 --rc genhtml_function_coverage=1 00:06:49.622 --rc genhtml_legend=1 00:06:49.622 --rc geninfo_all_blocks=1 00:06:49.622 --rc geninfo_unexecuted_blocks=1 00:06:49.622 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:49.622 ' 00:06:49.622 12:37:19 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:49.622 12:37:19 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=597400 00:06:49.622 12:37:19 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 597400 00:06:49.622 12:37:19 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:49.622 12:37:19 alias_rpc -- 
common/autotest_common.sh@835 -- # '[' -z 597400 ']' 00:06:49.622 12:37:19 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:49.622 12:37:19 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:49.622 12:37:19 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:49.622 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:49.622 12:37:19 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:49.622 12:37:19 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:49.622 [2024-11-28 12:37:19.604536] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:49.622 [2024-11-28 12:37:19.604607] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid597400 ] 00:06:49.622 [2024-11-28 12:37:19.740955] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:49.881 [2024-11-28 12:37:19.775309] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.881 [2024-11-28 12:37:19.799448] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.449 12:37:20 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:50.449 12:37:20 alias_rpc -- common/autotest_common.sh@868 -- # return 0 00:06:50.449 12:37:20 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:50.707 12:37:20 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 597400 00:06:50.707 12:37:20 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 597400 ']' 00:06:50.707 12:37:20 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 597400 00:06:50.707 12:37:20 alias_rpc -- common/autotest_common.sh@959 -- # uname 00:06:50.707 12:37:20 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:50.707 12:37:20 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 597400 00:06:50.708 12:37:20 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:50.708 12:37:20 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:50.708 12:37:20 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 597400' 00:06:50.708 killing process with pid 597400 00:06:50.708 12:37:20 alias_rpc -- common/autotest_common.sh@973 -- # kill 597400 00:06:50.708 12:37:20 alias_rpc -- common/autotest_common.sh@978 -- # wait 597400 00:06:50.966 00:06:50.966 real 0m1.638s 00:06:50.966 user 0m1.664s 00:06:50.966 sys 0m0.483s 00:06:50.966 12:37:21 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:50.966 12:37:21 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.966 ************************************ 00:06:50.966 END TEST alias_rpc 00:06:50.966 ************************************ 00:06:50.966 12:37:21 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:06:50.966 12:37:21 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:50.966 12:37:21 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:50.966 12:37:21 -- 
common/autotest_common.sh@1111 -- # xtrace_disable 00:06:50.966 12:37:21 -- common/autotest_common.sh@10 -- # set +x 00:06:51.225 ************************************ 00:06:51.225 START TEST spdkcli_tcp 00:06:51.225 ************************************ 00:06:51.225 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:51.225 * Looking for test storage... 00:06:51.225 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:51.225 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:51.225 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lcov --version 00:06:51.225 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:51.225 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:51.225 12:37:21 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:06:51.225 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:51.225 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:51.225 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.225 --rc genhtml_branch_coverage=1 00:06:51.225 --rc genhtml_function_coverage=1 00:06:51.225 --rc genhtml_legend=1 00:06:51.225 --rc geninfo_all_blocks=1 00:06:51.225 --rc geninfo_unexecuted_blocks=1 00:06:51.225 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.225 ' 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:51.226 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.226 --rc genhtml_branch_coverage=1 00:06:51.226 --rc genhtml_function_coverage=1 00:06:51.226 --rc genhtml_legend=1 00:06:51.226 --rc geninfo_all_blocks=1 00:06:51.226 --rc geninfo_unexecuted_blocks=1 00:06:51.226 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.226 ' 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:51.226 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.226 --rc genhtml_branch_coverage=1 00:06:51.226 --rc genhtml_function_coverage=1 00:06:51.226 --rc genhtml_legend=1 00:06:51.226 --rc geninfo_all_blocks=1 00:06:51.226 --rc geninfo_unexecuted_blocks=1 00:06:51.226 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.226 ' 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:51.226 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:51.226 --rc genhtml_branch_coverage=1 00:06:51.226 --rc genhtml_function_coverage=1 00:06:51.226 --rc genhtml_legend=1 00:06:51.226 --rc geninfo_all_blocks=1 00:06:51.226 --rc geninfo_unexecuted_blocks=1 00:06:51.226 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:51.226 ' 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/common.sh@7 -- # 
spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=597645 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:51.226 12:37:21 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 597645 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 597645 ']' 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:51.226 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:51.226 12:37:21 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:51.226 [2024-11-28 12:37:21.299280] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:51.226 [2024-11-28 12:37:21.299345] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid597645 ] 00:06:51.484 [2024-11-28 12:37:21.435492] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
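[editor's note] What follows is the core of the spdkcli_tcp test: the target listens on its default UNIX-domain RPC socket, socat bridges that socket to TCP on 127.0.0.1:9998, and rpc.py is pointed at the TCP side to prove the transport works end to end. Reduced to its essentials, with the socket path, address and port exactly as logged (the fork option on socat is an assumption added here to serve more than one connection):

  socat TCP-LISTEN:9998,fork UNIX-CONNECT:/var/tmp/spdk.sock &      # TCP-to-UNIX bridge
  socat_pid=$!
  scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods   # -r retries, -t timeout in s
  kill "$socat_pid"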
00:06:51.484 [2024-11-28 12:37:21.469984] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:51.484 [2024-11-28 12:37:21.494999] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:51.484 [2024-11-28 12:37:21.495001] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.050 12:37:22 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:52.050 12:37:22 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:06:52.050 12:37:22 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=597810 00:06:52.050 12:37:22 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:52.050 12:37:22 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:52.308 [ 00:06:52.308 "spdk_get_version", 00:06:52.308 "rpc_get_methods", 00:06:52.308 "notify_get_notifications", 00:06:52.308 "notify_get_types", 00:06:52.308 "trace_get_info", 00:06:52.308 "trace_get_tpoint_group_mask", 00:06:52.308 "trace_disable_tpoint_group", 00:06:52.308 "trace_enable_tpoint_group", 00:06:52.308 "trace_clear_tpoint_mask", 00:06:52.308 "trace_set_tpoint_mask", 00:06:52.308 "fsdev_set_opts", 00:06:52.309 "fsdev_get_opts", 00:06:52.309 "framework_get_pci_devices", 00:06:52.309 "framework_get_config", 00:06:52.309 "framework_get_subsystems", 00:06:52.309 "vfu_tgt_set_base_path", 00:06:52.309 "keyring_get_keys", 00:06:52.309 "iobuf_get_stats", 00:06:52.309 "iobuf_set_options", 00:06:52.309 "sock_get_default_impl", 00:06:52.309 "sock_set_default_impl", 00:06:52.309 "sock_impl_set_options", 00:06:52.309 "sock_impl_get_options", 00:06:52.309 "vmd_rescan", 00:06:52.309 "vmd_remove_device", 00:06:52.309 "vmd_enable", 00:06:52.309 "accel_get_stats", 00:06:52.309 "accel_set_options", 00:06:52.309 "accel_set_driver", 00:06:52.309 "accel_crypto_key_destroy", 00:06:52.309 "accel_crypto_keys_get", 00:06:52.309 "accel_crypto_key_create", 00:06:52.309 "accel_assign_opc", 00:06:52.309 "accel_get_module_info", 00:06:52.309 "accel_get_opc_assignments", 00:06:52.309 "bdev_get_histogram", 00:06:52.309 "bdev_enable_histogram", 00:06:52.309 "bdev_set_qos_limit", 00:06:52.309 "bdev_set_qd_sampling_period", 00:06:52.309 "bdev_get_bdevs", 00:06:52.309 "bdev_reset_iostat", 00:06:52.309 "bdev_get_iostat", 00:06:52.309 "bdev_examine", 00:06:52.309 "bdev_wait_for_examine", 00:06:52.309 "bdev_set_options", 00:06:52.309 "scsi_get_devices", 00:06:52.309 "thread_set_cpumask", 00:06:52.309 "scheduler_set_options", 00:06:52.309 "framework_get_governor", 00:06:52.309 "framework_get_scheduler", 00:06:52.309 "framework_set_scheduler", 00:06:52.309 "framework_get_reactors", 00:06:52.309 "thread_get_io_channels", 00:06:52.309 "thread_get_pollers", 00:06:52.309 "thread_get_stats", 00:06:52.309 "framework_monitor_context_switch", 00:06:52.309 "spdk_kill_instance", 00:06:52.309 "log_enable_timestamps", 00:06:52.309 "log_get_flags", 00:06:52.309 "log_clear_flag", 00:06:52.309 "log_set_flag", 00:06:52.309 "log_get_level", 00:06:52.309 "log_set_level", 00:06:52.309 "log_get_print_level", 00:06:52.309 "log_set_print_level", 00:06:52.309 "framework_enable_cpumask_locks", 00:06:52.309 "framework_disable_cpumask_locks", 00:06:52.309 "framework_wait_init", 00:06:52.309 "framework_start_init", 00:06:52.309 "virtio_blk_create_transport", 00:06:52.309 "virtio_blk_get_transports", 00:06:52.309 "vhost_controller_set_coalescing", 00:06:52.309 "vhost_get_controllers", 00:06:52.309 
"vhost_delete_controller", 00:06:52.309 "vhost_create_blk_controller", 00:06:52.309 "vhost_scsi_controller_remove_target", 00:06:52.309 "vhost_scsi_controller_add_target", 00:06:52.309 "vhost_start_scsi_controller", 00:06:52.309 "vhost_create_scsi_controller", 00:06:52.309 "ublk_recover_disk", 00:06:52.309 "ublk_get_disks", 00:06:52.309 "ublk_stop_disk", 00:06:52.309 "ublk_start_disk", 00:06:52.309 "ublk_destroy_target", 00:06:52.309 "ublk_create_target", 00:06:52.309 "nbd_get_disks", 00:06:52.309 "nbd_stop_disk", 00:06:52.309 "nbd_start_disk", 00:06:52.309 "env_dpdk_get_mem_stats", 00:06:52.309 "nvmf_stop_mdns_prr", 00:06:52.309 "nvmf_publish_mdns_prr", 00:06:52.309 "nvmf_subsystem_get_listeners", 00:06:52.309 "nvmf_subsystem_get_qpairs", 00:06:52.309 "nvmf_subsystem_get_controllers", 00:06:52.309 "nvmf_get_stats", 00:06:52.309 "nvmf_get_transports", 00:06:52.309 "nvmf_create_transport", 00:06:52.309 "nvmf_get_targets", 00:06:52.309 "nvmf_delete_target", 00:06:52.309 "nvmf_create_target", 00:06:52.309 "nvmf_subsystem_allow_any_host", 00:06:52.309 "nvmf_subsystem_set_keys", 00:06:52.309 "nvmf_subsystem_remove_host", 00:06:52.309 "nvmf_subsystem_add_host", 00:06:52.309 "nvmf_ns_remove_host", 00:06:52.309 "nvmf_ns_add_host", 00:06:52.309 "nvmf_subsystem_remove_ns", 00:06:52.309 "nvmf_subsystem_set_ns_ana_group", 00:06:52.309 "nvmf_subsystem_add_ns", 00:06:52.309 "nvmf_subsystem_listener_set_ana_state", 00:06:52.309 "nvmf_discovery_get_referrals", 00:06:52.309 "nvmf_discovery_remove_referral", 00:06:52.309 "nvmf_discovery_add_referral", 00:06:52.309 "nvmf_subsystem_remove_listener", 00:06:52.309 "nvmf_subsystem_add_listener", 00:06:52.309 "nvmf_delete_subsystem", 00:06:52.309 "nvmf_create_subsystem", 00:06:52.309 "nvmf_get_subsystems", 00:06:52.309 "nvmf_set_crdt", 00:06:52.309 "nvmf_set_config", 00:06:52.309 "nvmf_set_max_subsystems", 00:06:52.309 "iscsi_get_histogram", 00:06:52.309 "iscsi_enable_histogram", 00:06:52.309 "iscsi_set_options", 00:06:52.309 "iscsi_get_auth_groups", 00:06:52.309 "iscsi_auth_group_remove_secret", 00:06:52.309 "iscsi_auth_group_add_secret", 00:06:52.309 "iscsi_delete_auth_group", 00:06:52.309 "iscsi_create_auth_group", 00:06:52.309 "iscsi_set_discovery_auth", 00:06:52.309 "iscsi_get_options", 00:06:52.309 "iscsi_target_node_request_logout", 00:06:52.309 "iscsi_target_node_set_redirect", 00:06:52.309 "iscsi_target_node_set_auth", 00:06:52.309 "iscsi_target_node_add_lun", 00:06:52.309 "iscsi_get_stats", 00:06:52.309 "iscsi_get_connections", 00:06:52.309 "iscsi_portal_group_set_auth", 00:06:52.309 "iscsi_start_portal_group", 00:06:52.309 "iscsi_delete_portal_group", 00:06:52.309 "iscsi_create_portal_group", 00:06:52.309 "iscsi_get_portal_groups", 00:06:52.309 "iscsi_delete_target_node", 00:06:52.309 "iscsi_target_node_remove_pg_ig_maps", 00:06:52.309 "iscsi_target_node_add_pg_ig_maps", 00:06:52.309 "iscsi_create_target_node", 00:06:52.309 "iscsi_get_target_nodes", 00:06:52.309 "iscsi_delete_initiator_group", 00:06:52.309 "iscsi_initiator_group_remove_initiators", 00:06:52.309 "iscsi_initiator_group_add_initiators", 00:06:52.309 "iscsi_create_initiator_group", 00:06:52.309 "iscsi_get_initiator_groups", 00:06:52.309 "fsdev_aio_delete", 00:06:52.309 "fsdev_aio_create", 00:06:52.309 "keyring_linux_set_options", 00:06:52.309 "keyring_file_remove_key", 00:06:52.309 "keyring_file_add_key", 00:06:52.309 "vfu_virtio_create_fs_endpoint", 00:06:52.309 "vfu_virtio_create_scsi_endpoint", 00:06:52.310 "vfu_virtio_scsi_remove_target", 00:06:52.310 "vfu_virtio_scsi_add_target", 
00:06:52.310 "vfu_virtio_create_blk_endpoint", 00:06:52.310 "vfu_virtio_delete_endpoint", 00:06:52.310 "iaa_scan_accel_module", 00:06:52.310 "dsa_scan_accel_module", 00:06:52.310 "ioat_scan_accel_module", 00:06:52.310 "accel_error_inject_error", 00:06:52.310 "bdev_iscsi_delete", 00:06:52.310 "bdev_iscsi_create", 00:06:52.310 "bdev_iscsi_set_options", 00:06:52.310 "bdev_virtio_attach_controller", 00:06:52.310 "bdev_virtio_scsi_get_devices", 00:06:52.310 "bdev_virtio_detach_controller", 00:06:52.310 "bdev_virtio_blk_set_hotplug", 00:06:52.310 "bdev_ftl_set_property", 00:06:52.310 "bdev_ftl_get_properties", 00:06:52.310 "bdev_ftl_get_stats", 00:06:52.310 "bdev_ftl_unmap", 00:06:52.310 "bdev_ftl_unload", 00:06:52.310 "bdev_ftl_delete", 00:06:52.310 "bdev_ftl_load", 00:06:52.310 "bdev_ftl_create", 00:06:52.310 "bdev_aio_delete", 00:06:52.310 "bdev_aio_rescan", 00:06:52.310 "bdev_aio_create", 00:06:52.310 "blobfs_create", 00:06:52.310 "blobfs_detect", 00:06:52.310 "blobfs_set_cache_size", 00:06:52.310 "bdev_zone_block_delete", 00:06:52.310 "bdev_zone_block_create", 00:06:52.310 "bdev_delay_delete", 00:06:52.310 "bdev_delay_create", 00:06:52.310 "bdev_delay_update_latency", 00:06:52.310 "bdev_split_delete", 00:06:52.310 "bdev_split_create", 00:06:52.310 "bdev_error_inject_error", 00:06:52.310 "bdev_error_delete", 00:06:52.310 "bdev_error_create", 00:06:52.310 "bdev_raid_set_options", 00:06:52.310 "bdev_raid_remove_base_bdev", 00:06:52.310 "bdev_raid_add_base_bdev", 00:06:52.310 "bdev_raid_delete", 00:06:52.310 "bdev_raid_create", 00:06:52.310 "bdev_raid_get_bdevs", 00:06:52.310 "bdev_lvol_set_parent_bdev", 00:06:52.310 "bdev_lvol_set_parent", 00:06:52.310 "bdev_lvol_check_shallow_copy", 00:06:52.310 "bdev_lvol_start_shallow_copy", 00:06:52.310 "bdev_lvol_grow_lvstore", 00:06:52.310 "bdev_lvol_get_lvols", 00:06:52.310 "bdev_lvol_get_lvstores", 00:06:52.310 "bdev_lvol_delete", 00:06:52.310 "bdev_lvol_set_read_only", 00:06:52.310 "bdev_lvol_resize", 00:06:52.310 "bdev_lvol_decouple_parent", 00:06:52.310 "bdev_lvol_inflate", 00:06:52.310 "bdev_lvol_rename", 00:06:52.310 "bdev_lvol_clone_bdev", 00:06:52.310 "bdev_lvol_clone", 00:06:52.310 "bdev_lvol_snapshot", 00:06:52.310 "bdev_lvol_create", 00:06:52.310 "bdev_lvol_delete_lvstore", 00:06:52.310 "bdev_lvol_rename_lvstore", 00:06:52.310 "bdev_lvol_create_lvstore", 00:06:52.310 "bdev_passthru_delete", 00:06:52.310 "bdev_passthru_create", 00:06:52.310 "bdev_nvme_cuse_unregister", 00:06:52.310 "bdev_nvme_cuse_register", 00:06:52.310 "bdev_opal_new_user", 00:06:52.310 "bdev_opal_set_lock_state", 00:06:52.310 "bdev_opal_delete", 00:06:52.310 "bdev_opal_get_info", 00:06:52.310 "bdev_opal_create", 00:06:52.310 "bdev_nvme_opal_revert", 00:06:52.310 "bdev_nvme_opal_init", 00:06:52.310 "bdev_nvme_send_cmd", 00:06:52.310 "bdev_nvme_set_keys", 00:06:52.310 "bdev_nvme_get_path_iostat", 00:06:52.310 "bdev_nvme_get_mdns_discovery_info", 00:06:52.310 "bdev_nvme_stop_mdns_discovery", 00:06:52.310 "bdev_nvme_start_mdns_discovery", 00:06:52.310 "bdev_nvme_set_multipath_policy", 00:06:52.310 "bdev_nvme_set_preferred_path", 00:06:52.310 "bdev_nvme_get_io_paths", 00:06:52.310 "bdev_nvme_remove_error_injection", 00:06:52.310 "bdev_nvme_add_error_injection", 00:06:52.310 "bdev_nvme_get_discovery_info", 00:06:52.310 "bdev_nvme_stop_discovery", 00:06:52.310 "bdev_nvme_start_discovery", 00:06:52.310 "bdev_nvme_get_controller_health_info", 00:06:52.310 "bdev_nvme_disable_controller", 00:06:52.310 "bdev_nvme_enable_controller", 00:06:52.310 "bdev_nvme_reset_controller", 
00:06:52.310 "bdev_nvme_get_transport_statistics", 00:06:52.310 "bdev_nvme_apply_firmware", 00:06:52.310 "bdev_nvme_detach_controller", 00:06:52.310 "bdev_nvme_get_controllers", 00:06:52.310 "bdev_nvme_attach_controller", 00:06:52.310 "bdev_nvme_set_hotplug", 00:06:52.310 "bdev_nvme_set_options", 00:06:52.310 "bdev_null_resize", 00:06:52.310 "bdev_null_delete", 00:06:52.310 "bdev_null_create", 00:06:52.310 "bdev_malloc_delete", 00:06:52.310 "bdev_malloc_create" 00:06:52.310 ] 00:06:52.310 12:37:22 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:52.310 12:37:22 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:06:52.310 12:37:22 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:52.310 12:37:22 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:52.310 12:37:22 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 597645 00:06:52.310 12:37:22 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 597645 ']' 00:06:52.310 12:37:22 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 597645 00:06:52.310 12:37:22 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:06:52.310 12:37:22 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:52.310 12:37:22 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 597645 00:06:52.568 12:37:22 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:52.568 12:37:22 spdkcli_tcp -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:52.568 12:37:22 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 597645' 00:06:52.568 killing process with pid 597645 00:06:52.568 12:37:22 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 597645 00:06:52.568 12:37:22 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 597645 00:06:52.827 00:06:52.827 real 0m1.648s 00:06:52.827 user 0m2.912s 00:06:52.827 sys 0m0.505s 00:06:52.827 12:37:22 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:52.827 12:37:22 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:52.827 ************************************ 00:06:52.827 END TEST spdkcli_tcp 00:06:52.827 ************************************ 00:06:52.827 12:37:22 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:52.827 12:37:22 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:52.827 12:37:22 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:52.827 12:37:22 -- common/autotest_common.sh@10 -- # set +x 00:06:52.827 ************************************ 00:06:52.827 START TEST dpdk_mem_utility 00:06:52.827 ************************************ 00:06:52.827 12:37:22 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:52.827 * Looking for test storage... 
00:06:52.827 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:52.827 12:37:22 dpdk_mem_utility -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:52.827 12:37:22 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lcov --version 00:06:52.827 12:37:22 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:53.086 12:37:22 dpdk_mem_utility -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:06:53.086 12:37:22 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:53.086 12:37:23 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:53.086 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.086 --rc genhtml_branch_coverage=1 00:06:53.086 --rc genhtml_function_coverage=1 00:06:53.086 --rc genhtml_legend=1 00:06:53.086 --rc geninfo_all_blocks=1 00:06:53.086 --rc geninfo_unexecuted_blocks=1 00:06:53.086 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.086 ' 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@1706 -- # 
LCOV_OPTS=' 00:06:53.086 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.086 --rc genhtml_branch_coverage=1 00:06:53.086 --rc genhtml_function_coverage=1 00:06:53.086 --rc genhtml_legend=1 00:06:53.086 --rc geninfo_all_blocks=1 00:06:53.086 --rc geninfo_unexecuted_blocks=1 00:06:53.086 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.086 ' 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:53.086 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.086 --rc genhtml_branch_coverage=1 00:06:53.086 --rc genhtml_function_coverage=1 00:06:53.086 --rc genhtml_legend=1 00:06:53.086 --rc geninfo_all_blocks=1 00:06:53.086 --rc geninfo_unexecuted_blocks=1 00:06:53.086 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.086 ' 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:53.086 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:53.086 --rc genhtml_branch_coverage=1 00:06:53.086 --rc genhtml_function_coverage=1 00:06:53.086 --rc genhtml_legend=1 00:06:53.086 --rc geninfo_all_blocks=1 00:06:53.086 --rc geninfo_unexecuted_blocks=1 00:06:53.086 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:53.086 ' 00:06:53.086 12:37:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:53.086 12:37:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=597907 00:06:53.086 12:37:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 597907 00:06:53.086 12:37:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 597907 ']' 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.086 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:53.086 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:53.086 [2024-11-28 12:37:23.037971] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:53.086 [2024-11-28 12:37:23.038055] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid597907 ] 00:06:53.086 [2024-11-28 12:37:23.178060] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
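[editor's note] The memory dump that follows is produced by two commands visible in the trace: the env_dpdk_get_mem_stats RPC, which makes the target write its raw map to /tmp/spdk_mem_dump.txt, and scripts/dpdk_mem_info.py, which renders that file as the heap/mempool/memzone summary below (and, with -m 0, the per-element listing for heap 0). Against a running target this is simply (assuming the default /var/tmp/spdk.sock RPC socket):

  scripts/rpc.py env_dpdk_get_mem_stats    # target writes /tmp/spdk_mem_dump.txt
  scripts/dpdk_mem_info.py                 # summarize heaps, mempools, memzones
  scripts/dpdk_mem_info.py -m 0            # element-level view of heap 0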
00:06:53.086 [2024-11-28 12:37:23.209036] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.344 [2024-11-28 12:37:23.232983] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.909 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:53.909 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0 00:06:53.909 12:37:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:53.909 12:37:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:53.909 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:53.909 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:53.909 { 00:06:53.909 "filename": "/tmp/spdk_mem_dump.txt" 00:06:53.909 } 00:06:53.909 12:37:23 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:53.909 12:37:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:53.909 DPDK memory size 818.000000 MiB in 1 heap(s) 00:06:53.909 1 heaps totaling size 818.000000 MiB 00:06:53.909 size: 818.000000 MiB heap id: 0 00:06:53.909 end heaps---------- 00:06:53.909 9 mempools totaling size 603.782043 MiB 00:06:53.909 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:53.909 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:53.909 size: 100.555481 MiB name: bdev_io_597907 00:06:53.909 size: 50.003479 MiB name: msgpool_597907 00:06:53.909 size: 36.509338 MiB name: fsdev_io_597907 00:06:53.909 size: 21.763794 MiB name: PDU_Pool 00:06:53.909 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:53.909 size: 4.133484 MiB name: evtpool_597907 00:06:53.909 size: 0.026123 MiB name: Session_Pool 00:06:53.909 end mempools------- 00:06:53.909 6 memzones totaling size 4.142822 MiB 00:06:53.909 size: 1.000366 MiB name: RG_ring_0_597907 00:06:53.909 size: 1.000366 MiB name: RG_ring_1_597907 00:06:53.909 size: 1.000366 MiB name: RG_ring_4_597907 00:06:53.909 size: 1.000366 MiB name: RG_ring_5_597907 00:06:53.909 size: 0.125366 MiB name: RG_ring_2_597907 00:06:53.909 size: 0.015991 MiB name: RG_ring_3_597907 00:06:53.909 end memzones------- 00:06:53.909 12:37:23 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:53.909 heap id: 0 total size: 818.000000 MiB number of busy elements: 43 number of free elements: 15 00:06:53.909 list of free elements. 
size: 10.993225 MiB 00:06:53.909 element at address: 0x200019200000 with size: 0.999878 MiB 00:06:53.909 element at address: 0x200019400000 with size: 0.999878 MiB 00:06:53.909 element at address: 0x200000400000 with size: 0.998535 MiB 00:06:53.909 element at address: 0x200032000000 with size: 0.994446 MiB 00:06:53.909 element at address: 0x200008000000 with size: 0.959839 MiB 00:06:53.909 element at address: 0x200012c00000 with size: 0.944275 MiB 00:06:53.909 element at address: 0x200019600000 with size: 0.936584 MiB 00:06:53.909 element at address: 0x200000200000 with size: 0.858093 MiB 00:06:53.909 element at address: 0x20001ae00000 with size: 0.582886 MiB 00:06:53.909 element at address: 0x200000c00000 with size: 0.495422 MiB 00:06:53.909 element at address: 0x200003e00000 with size: 0.490723 MiB 00:06:53.909 element at address: 0x200019800000 with size: 0.485657 MiB 00:06:53.909 element at address: 0x200010600000 with size: 0.481934 MiB 00:06:53.909 element at address: 0x200028200000 with size: 0.410034 MiB 00:06:53.909 element at address: 0x200000800000 with size: 0.355042 MiB 00:06:53.909 list of standard malloc elements. size: 199.077881 MiB 00:06:53.909 element at address: 0x2000081fff80 with size: 132.000122 MiB 00:06:53.909 element at address: 0x200003ffff80 with size: 64.000122 MiB 00:06:53.909 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:53.909 element at address: 0x2000194fff80 with size: 1.000122 MiB 00:06:53.909 element at address: 0x2000196fff80 with size: 1.000122 MiB 00:06:53.909 element at address: 0x2000196eff00 with size: 0.062622 MiB 00:06:53.909 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:53.909 element at address: 0x2000196efdc0 with size: 0.000305 MiB 00:06:53.909 element at address: 0x2000002fbcc0 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000003fdec0 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000004ffa00 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000004ffac0 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000004ffb80 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000004ffd80 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000004ffe40 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20000085ae40 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20000085b040 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20000085b100 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000008db3c0 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000008db5c0 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000008df880 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000008ffb40 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200000c7ed40 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200000cff000 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200000cff0c0 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200003e7da00 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200003e7dac0 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200003efdd80 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000080fdd80 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20001067b600 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20001067b6c0 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000106fb980 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200012cf1bc0 with size: 0.000183 MiB 
00:06:53.909 element at address: 0x2000196efc40 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000196efd00 with size: 0.000183 MiB 00:06:53.909 element at address: 0x2000198bc740 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20001ae95380 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20001ae95440 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200028268f80 with size: 0.000183 MiB 00:06:53.909 element at address: 0x200028269040 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20002826fc40 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20002826fe40 with size: 0.000183 MiB 00:06:53.909 element at address: 0x20002826ff00 with size: 0.000183 MiB 00:06:53.909 list of memzone associated elements. size: 607.928894 MiB 00:06:53.909 element at address: 0x20001ae95500 with size: 211.416748 MiB 00:06:53.909 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:53.909 element at address: 0x20002826ffc0 with size: 157.562561 MiB 00:06:53.909 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:53.909 element at address: 0x200012df1e80 with size: 100.055054 MiB 00:06:53.909 associated memzone info: size: 100.054932 MiB name: MP_bdev_io_597907_0 00:06:53.909 element at address: 0x200000dff380 with size: 48.003052 MiB 00:06:53.909 associated memzone info: size: 48.002930 MiB name: MP_msgpool_597907_0 00:06:53.909 element at address: 0x2000107fdb80 with size: 36.008911 MiB 00:06:53.909 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_597907_0 00:06:53.909 element at address: 0x2000199be940 with size: 20.255554 MiB 00:06:53.909 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:53.909 element at address: 0x2000321feb40 with size: 18.005066 MiB 00:06:53.909 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:53.909 element at address: 0x2000004fff00 with size: 3.000244 MiB 00:06:53.909 associated memzone info: size: 3.000122 MiB name: MP_evtpool_597907_0 00:06:53.909 element at address: 0x2000009ffe00 with size: 2.000488 MiB 00:06:53.909 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_597907 00:06:53.909 element at address: 0x2000002fbd80 with size: 1.008118 MiB 00:06:53.909 associated memzone info: size: 1.007996 MiB name: MP_evtpool_597907 00:06:53.909 element at address: 0x2000106fba40 with size: 1.008118 MiB 00:06:53.909 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:53.909 element at address: 0x2000198bc800 with size: 1.008118 MiB 00:06:53.909 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:53.909 element at address: 0x2000080fde40 with size: 1.008118 MiB 00:06:53.909 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:53.909 element at address: 0x200003efde40 with size: 1.008118 MiB 00:06:53.909 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:53.909 element at address: 0x200000cff180 with size: 1.000488 MiB 00:06:53.909 associated memzone info: size: 1.000366 MiB name: RG_ring_0_597907 00:06:53.909 element at address: 0x2000008ffc00 with size: 1.000488 MiB 00:06:53.909 associated memzone info: size: 1.000366 MiB name: RG_ring_1_597907 00:06:53.909 element at address: 0x200012cf1c80 with size: 1.000488 MiB 00:06:53.909 associated memzone info: size: 1.000366 MiB name: RG_ring_4_597907 00:06:53.909 element at address: 0x2000320fe940 with size: 1.000488 MiB 00:06:53.909 associated memzone info: 
size: 1.000366 MiB name: RG_ring_5_597907 00:06:53.909 element at address: 0x20000085b1c0 with size: 0.500488 MiB 00:06:53.909 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_597907 00:06:53.909 element at address: 0x200000c7ee00 with size: 0.500488 MiB 00:06:53.910 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_597907 00:06:53.910 element at address: 0x20001067b780 with size: 0.500488 MiB 00:06:53.910 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:53.910 element at address: 0x200003e7db80 with size: 0.500488 MiB 00:06:53.910 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:53.910 element at address: 0x20001987c540 with size: 0.250488 MiB 00:06:53.910 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:53.910 element at address: 0x2000002dbac0 with size: 0.125488 MiB 00:06:53.910 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_597907 00:06:53.910 element at address: 0x2000008df940 with size: 0.125488 MiB 00:06:53.910 associated memzone info: size: 0.125366 MiB name: RG_ring_2_597907 00:06:53.910 element at address: 0x2000080f5b80 with size: 0.031738 MiB 00:06:53.910 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:53.910 element at address: 0x200028269100 with size: 0.023743 MiB 00:06:53.910 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:53.910 element at address: 0x2000008db680 with size: 0.016113 MiB 00:06:53.910 associated memzone info: size: 0.015991 MiB name: RG_ring_3_597907 00:06:53.910 element at address: 0x20002826f240 with size: 0.002441 MiB 00:06:53.910 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:53.910 element at address: 0x2000004ffc40 with size: 0.000305 MiB 00:06:53.910 associated memzone info: size: 0.000183 MiB name: MP_msgpool_597907 00:06:53.910 element at address: 0x2000008db480 with size: 0.000305 MiB 00:06:53.910 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_597907 00:06:53.910 element at address: 0x20000085af00 with size: 0.000305 MiB 00:06:53.910 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_597907 00:06:53.910 element at address: 0x20002826fd00 with size: 0.000305 MiB 00:06:53.910 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:53.910 12:37:24 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:53.910 12:37:24 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 597907 00:06:53.910 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 597907 ']' 00:06:53.910 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 597907 00:06:53.910 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname 00:06:53.910 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:53.910 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 597907 00:06:54.168 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:54.168 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:54.168 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 597907' 00:06:54.168 killing process with pid 597907 00:06:54.168 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 597907 00:06:54.168 12:37:24 
dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 597907 00:06:54.426 00:06:54.426 real 0m1.545s 00:06:54.426 user 0m1.526s 00:06:54.426 sys 0m0.454s 00:06:54.426 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:54.426 12:37:24 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:54.426 ************************************ 00:06:54.426 END TEST dpdk_mem_utility 00:06:54.426 ************************************ 00:06:54.426 12:37:24 -- spdk/autotest.sh@168 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:54.426 12:37:24 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:54.426 12:37:24 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:54.426 12:37:24 -- common/autotest_common.sh@10 -- # set +x 00:06:54.426 ************************************ 00:06:54.426 START TEST event 00:06:54.426 ************************************ 00:06:54.426 12:37:24 event -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:54.426 * Looking for test storage... 00:06:54.426 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:54.426 12:37:24 event -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:54.426 12:37:24 event -- common/autotest_common.sh@1693 -- # lcov --version 00:06:54.426 12:37:24 event -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:54.685 12:37:24 event -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:54.685 12:37:24 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:54.685 12:37:24 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:54.685 12:37:24 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:54.685 12:37:24 event -- scripts/common.sh@336 -- # IFS=.-: 00:06:54.685 12:37:24 event -- scripts/common.sh@336 -- # read -ra ver1 00:06:54.686 12:37:24 event -- scripts/common.sh@337 -- # IFS=.-: 00:06:54.686 12:37:24 event -- scripts/common.sh@337 -- # read -ra ver2 00:06:54.686 12:37:24 event -- scripts/common.sh@338 -- # local 'op=<' 00:06:54.686 12:37:24 event -- scripts/common.sh@340 -- # ver1_l=2 00:06:54.686 12:37:24 event -- scripts/common.sh@341 -- # ver2_l=1 00:06:54.686 12:37:24 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:54.686 12:37:24 event -- scripts/common.sh@344 -- # case "$op" in 00:06:54.686 12:37:24 event -- scripts/common.sh@345 -- # : 1 00:06:54.686 12:37:24 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:54.686 12:37:24 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:54.686 12:37:24 event -- scripts/common.sh@365 -- # decimal 1 00:06:54.686 12:37:24 event -- scripts/common.sh@353 -- # local d=1 00:06:54.686 12:37:24 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:54.686 12:37:24 event -- scripts/common.sh@355 -- # echo 1 00:06:54.686 12:37:24 event -- scripts/common.sh@365 -- # ver1[v]=1 00:06:54.686 12:37:24 event -- scripts/common.sh@366 -- # decimal 2 00:06:54.686 12:37:24 event -- scripts/common.sh@353 -- # local d=2 00:06:54.686 12:37:24 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:54.686 12:37:24 event -- scripts/common.sh@355 -- # echo 2 00:06:54.686 12:37:24 event -- scripts/common.sh@366 -- # ver2[v]=2 00:06:54.686 12:37:24 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:54.686 12:37:24 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:54.686 12:37:24 event -- scripts/common.sh@368 -- # return 0 00:06:54.686 12:37:24 event -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:54.686 12:37:24 event -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:54.686 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.686 --rc genhtml_branch_coverage=1 00:06:54.686 --rc genhtml_function_coverage=1 00:06:54.686 --rc genhtml_legend=1 00:06:54.686 --rc geninfo_all_blocks=1 00:06:54.686 --rc geninfo_unexecuted_blocks=1 00:06:54.686 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.686 ' 00:06:54.686 12:37:24 event -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:54.686 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.686 --rc genhtml_branch_coverage=1 00:06:54.686 --rc genhtml_function_coverage=1 00:06:54.686 --rc genhtml_legend=1 00:06:54.686 --rc geninfo_all_blocks=1 00:06:54.686 --rc geninfo_unexecuted_blocks=1 00:06:54.686 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.686 ' 00:06:54.686 12:37:24 event -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:54.686 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.686 --rc genhtml_branch_coverage=1 00:06:54.686 --rc genhtml_function_coverage=1 00:06:54.686 --rc genhtml_legend=1 00:06:54.686 --rc geninfo_all_blocks=1 00:06:54.686 --rc geninfo_unexecuted_blocks=1 00:06:54.686 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.686 ' 00:06:54.686 12:37:24 event -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:54.686 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:54.686 --rc genhtml_branch_coverage=1 00:06:54.686 --rc genhtml_function_coverage=1 00:06:54.686 --rc genhtml_legend=1 00:06:54.686 --rc geninfo_all_blocks=1 00:06:54.686 --rc geninfo_unexecuted_blocks=1 00:06:54.686 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:54.686 ' 00:06:54.686 12:37:24 event -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:54.686 12:37:24 event -- bdev/nbd_common.sh@6 -- # set -e 00:06:54.686 12:37:24 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:54.686 12:37:24 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:06:54.686 12:37:24 event -- common/autotest_common.sh@1111 -- # xtrace_disable 
00:06:54.686 12:37:24 event -- common/autotest_common.sh@10 -- # set +x 00:06:54.686 ************************************ 00:06:54.686 START TEST event_perf 00:06:54.686 ************************************ 00:06:54.686 12:37:24 event.event_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:54.686 Running I/O for 1 seconds...[2024-11-28 12:37:24.688185] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:54.686 [2024-11-28 12:37:24.688300] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598207 ] 00:06:54.949 [2024-11-28 12:37:24.830134] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:06:54.949 [2024-11-28 12:37:24.862674] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:54.949 [2024-11-28 12:37:24.892611] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:54.949 [2024-11-28 12:37:24.892697] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:06:54.949 [2024-11-28 12:37:24.892762] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:06:54.949 [2024-11-28 12:37:24.892763] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.885 Running I/O for 1 seconds... 00:06:55.885 lcore 0: 185071 00:06:55.885 lcore 1: 185069 00:06:55.885 lcore 2: 185070 00:06:55.885 lcore 3: 185071 00:06:55.885 done. 00:06:55.885 00:06:55.885 real 0m1.258s 00:06:55.885 user 0m4.069s 00:06:55.885 sys 0m0.080s 00:06:55.885 12:37:25 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:55.885 12:37:25 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:06:55.885 ************************************ 00:06:55.885 END TEST event_perf 00:06:55.885 ************************************ 00:06:55.885 12:37:25 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:55.885 12:37:25 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:06:55.885 12:37:25 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:55.885 12:37:25 event -- common/autotest_common.sh@10 -- # set +x 00:06:55.885 ************************************ 00:06:55.885 START TEST event_reactor 00:06:55.885 ************************************ 00:06:55.885 12:37:25 event.event_reactor -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:56.144 [2024-11-28 12:37:26.017667] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:56.144 [2024-11-28 12:37:26.017766] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598425 ] 00:06:56.144 [2024-11-28 12:37:26.157831] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
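Each test binary above is a regular SPDK application, so alongside its own argv the log records the DPDK EAL parameters the harness injects (--no-shconf, --huge-unlink, --no-telemetry, a fixed --base-virtaddr and a per-PID --file-prefix, so parallel tests never share hugepage state). Rerunning one of them by hand only needs the app-level flags shown in the trace, e.g.:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk   # workspace path from this job
  $SPDK/test/event/event_perf/event_perf -m 0xF -t 1         # 4 cores, 1 second; prints per-lcore event counts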
00:06:56.144 [2024-11-28 12:37:26.191534] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.144 [2024-11-28 12:37:26.214901] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.521 test_start 00:06:57.521 oneshot 00:06:57.521 tick 100 00:06:57.521 tick 100 00:06:57.521 tick 250 00:06:57.521 tick 100 00:06:57.521 tick 100 00:06:57.521 tick 100 00:06:57.521 tick 250 00:06:57.521 tick 500 00:06:57.521 tick 100 00:06:57.521 tick 100 00:06:57.521 tick 250 00:06:57.521 tick 100 00:06:57.521 tick 100 00:06:57.521 test_end 00:06:57.521 00:06:57.521 real 0m1.248s 00:06:57.521 user 0m1.069s 00:06:57.521 sys 0m0.076s 00:06:57.521 12:37:27 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:57.521 12:37:27 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:06:57.521 ************************************ 00:06:57.521 END TEST event_reactor 00:06:57.521 ************************************ 00:06:57.521 12:37:27 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:57.521 12:37:27 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:06:57.521 12:37:27 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:57.521 12:37:27 event -- common/autotest_common.sh@10 -- # set +x 00:06:57.521 ************************************ 00:06:57.521 START TEST event_reactor_perf 00:06:57.521 ************************************ 00:06:57.521 12:37:27 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:57.521 [2024-11-28 12:37:27.345781] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:57.521 [2024-11-28 12:37:27.345862] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598637 ] 00:06:57.521 [2024-11-28 12:37:27.484715] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
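The event_reactor run above emits a plain-text trace: test_start, a oneshot event, tick lines whose numbers appear to correspond to the registered poller periods, then test_end once the -t 1 second budget is spent (the ~1.25s real time includes EAL setup). reactor_perf, booting next, takes the same -t flag and reports raw event throughput instead:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  $SPDK/test/event/reactor/reactor -t 1             # emits the oneshot/tick trace above
  $SPDK/test/event/reactor_perf/reactor_perf -t 1   # reports events per second on one core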
00:06:57.521 [2024-11-28 12:37:27.517758] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.521 [2024-11-28 12:37:27.544640] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.458 test_start 00:06:58.458 test_end 00:06:58.458 Performance: 938339 events per second 00:06:58.458 00:06:58.458 real 0m1.248s 00:06:58.458 user 0m1.059s 00:06:58.458 sys 0m0.085s 00:06:58.458 12:37:28 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:58.458 12:37:28 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:58.458 ************************************ 00:06:58.458 END TEST event_reactor_perf 00:06:58.458 ************************************ 00:06:58.718 12:37:28 event -- event/event.sh@49 -- # uname -s 00:06:58.718 12:37:28 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:58.718 12:37:28 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:58.718 12:37:28 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:58.718 12:37:28 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:58.718 12:37:28 event -- common/autotest_common.sh@10 -- # set +x 00:06:58.718 ************************************ 00:06:58.718 START TEST event_scheduler 00:06:58.718 ************************************ 00:06:58.718 12:37:28 event.event_scheduler -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:58.718 * Looking for test storage... 00:06:58.718 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:58.718 12:37:28 event.event_scheduler -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:06:58.718 12:37:28 event.event_scheduler -- common/autotest_common.sh@1693 -- # lcov --version 00:06:58.718 12:37:28 event.event_scheduler -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:06:58.718 12:37:28 event.event_scheduler -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:06:58.718 12:37:28 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:06:58.719 12:37:28 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:58.719 12:37:28 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:58.719 12:37:28 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:06:58.719 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:58.719 --rc genhtml_branch_coverage=1 00:06:58.719 --rc genhtml_function_coverage=1 00:06:58.719 --rc genhtml_legend=1 00:06:58.719 --rc geninfo_all_blocks=1 00:06:58.719 --rc geninfo_unexecuted_blocks=1 00:06:58.719 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:58.719 ' 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:06:58.719 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:58.719 --rc genhtml_branch_coverage=1 00:06:58.719 --rc genhtml_function_coverage=1 00:06:58.719 --rc genhtml_legend=1 00:06:58.719 --rc geninfo_all_blocks=1 00:06:58.719 --rc geninfo_unexecuted_blocks=1 00:06:58.719 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:58.719 ' 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:06:58.719 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:58.719 --rc genhtml_branch_coverage=1 00:06:58.719 --rc genhtml_function_coverage=1 00:06:58.719 --rc genhtml_legend=1 00:06:58.719 --rc geninfo_all_blocks=1 00:06:58.719 --rc geninfo_unexecuted_blocks=1 00:06:58.719 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:58.719 ' 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:06:58.719 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:58.719 --rc genhtml_branch_coverage=1 00:06:58.719 --rc genhtml_function_coverage=1 00:06:58.719 --rc genhtml_legend=1 00:06:58.719 --rc geninfo_all_blocks=1 00:06:58.719 --rc geninfo_unexecuted_blocks=1 00:06:58.719 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:58.719 ' 00:06:58.719 12:37:28 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:58.719 12:37:28 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=598933 00:06:58.719 12:37:28 event.event_scheduler -- 
scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:58.719 12:37:28 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 598933 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 598933 ']' 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:58.719 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:58.719 12:37:28 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:58.719 12:37:28 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:58.719 [2024-11-28 12:37:28.842566] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:06:58.719 [2024-11-28 12:37:28.842656] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid598933 ] 00:06:58.981 [2024-11-28 12:37:28.980308] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:06:58.981 [2024-11-28 12:37:29.010492] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:58.981 [2024-11-28 12:37:29.037691] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.981 [2024-11-28 12:37:29.037716] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:06:58.981 [2024-11-28 12:37:29.037789] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:06:58.981 [2024-11-28 12:37:29.037791] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:06:59.918 12:37:29 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 [2024-11-28 12:37:29.698605] dpdk_governor.c: 178:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:06:59.918 [2024-11-28 12:37:29.698624] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:06:59.918 [2024-11-28 12:37:29.698636] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:59.918 [2024-11-28 12:37:29.698643] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:59.918 [2024-11-28 12:37:29.698651] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 [2024-11-28 12:37:29.768172] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
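The scheduler app is launched with -m 0xF -p 0x2 --wait-for-rpc -f, so it boots its reactors but defers subsystem init until waitforlisten sees it accept /var/tmp/spdk.sock; the test then switches to the dynamic scheduler (its defaults, load limit 20 / core limit 80 / core busy 95, are echoed above, and the dpdk governor is skipped because 0xF covers only part of a set of SMT siblings) and completes startup. Condensed, with a plain socket poll standing in for the fuller waitforlisten helper, which also checks the pid:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk.sock"
  for i in {1..100}; do [ -S /var/tmp/spdk.sock ] && break; sleep 0.1; done
  $RPC framework_set_scheduler dynamic   # triggers the load/core/busy notices above
  $RPC framework_start_init              # finishes the deferred --wait-for-rpc startup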
00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 ************************************ 00:06:59.918 START TEST scheduler_create_thread 00:06:59.918 ************************************ 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 2 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 3 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 4 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 5 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 6 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 7 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 8 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 9 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 10 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:59.918 12:37:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:00.486 12:37:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:00.486 12:37:30 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin 
scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:00.486 12:37:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:00.486 12:37:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:02.094 12:37:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:02.094 12:37:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:02.094 12:37:31 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:02.094 12:37:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:02.094 12:37:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.057 12:37:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:03.057 00:07:03.057 real 0m3.092s 00:07:03.057 user 0m0.023s 00:07:03.057 sys 0m0.009s 00:07:03.057 12:37:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:03.057 12:37:32 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:03.057 ************************************ 00:07:03.057 END TEST scheduler_create_thread 00:07:03.057 ************************************ 00:07:03.057 12:37:32 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:03.057 12:37:32 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 598933 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 598933 ']' 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 598933 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 598933 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 598933' 00:07:03.057 killing process with pid 598933 00:07:03.057 12:37:32 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 598933 00:07:03.057 12:37:33 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 598933 00:07:03.316 [2024-11-28 12:37:33.279331] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
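scheduler_create_thread drives everything through the test's RPC plugin: four fully active threads pinned to cores 0x1 through 0x8, four idle pinned ones, an unpinned one_third_active thread at 30%, then a half_active thread whose activity is raised to 50 and a 'deleted' thread that is removed again; the final kill is reported by reactor_2 because -p 0x2 made core 2 the main lcore. The calls look like this, with the new thread id captured from stdout the same way the thread_id=11/12 assignments above do:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  RPC="$SPDK/scripts/rpc.py -s /var/tmp/spdk.sock --plugin scheduler_plugin"
  $RPC scheduler_thread_create -n active_pinned -m 0x1 -a 100   # 100% active, pinned to core 0
  $RPC scheduler_thread_create -n idle_pinned   -m 0x1 -a 0     # 0% active
  tid=$($RPC scheduler_thread_create -n half_active -a 0)       # unpinned
  $RPC scheduler_thread_set_active "$tid" 50
  tid=$($RPC scheduler_thread_create -n deleted -a 100)
  $RPC scheduler_thread_delete "$tid"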
00:07:03.575 00:07:03.575 real 0m4.800s 00:07:03.575 user 0m9.144s 00:07:03.575 sys 0m0.438s 00:07:03.575 12:37:33 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:03.575 12:37:33 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:03.575 ************************************ 00:07:03.575 END TEST event_scheduler 00:07:03.575 ************************************ 00:07:03.575 12:37:33 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:03.575 12:37:33 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:03.575 12:37:33 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:03.575 12:37:33 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:03.575 12:37:33 event -- common/autotest_common.sh@10 -- # set +x 00:07:03.575 ************************************ 00:07:03.575 START TEST app_repeat 00:07:03.575 ************************************ 00:07:03.575 12:37:33 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@19 -- # repeat_pid=599535 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 599535' 00:07:03.575 Process app_repeat pid: 599535 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:03.575 spdk_app_start Round 0 00:07:03.575 12:37:33 event.app_repeat -- event/event.sh@25 -- # waitforlisten 599535 /var/tmp/spdk-nbd.sock 00:07:03.575 12:37:33 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 599535 ']' 00:07:03.575 12:37:33 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:03.575 12:37:33 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:03.575 12:37:33 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:03.575 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:03.575 12:37:33 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:03.575 12:37:33 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:03.575 [2024-11-28 12:37:33.586609] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:07:03.575 [2024-11-28 12:37:33.586693] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid599535 ] 00:07:03.834 [2024-11-28 12:37:33.730218] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:03.834 [2024-11-28 12:37:33.761370] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:03.834 [2024-11-28 12:37:33.786487] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:03.834 [2024-11-28 12:37:33.786513] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.401 12:37:34 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:04.401 12:37:34 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:04.401 12:37:34 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:04.660 Malloc0 00:07:04.660 12:37:34 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:04.918 Malloc1 00:07:04.918 12:37:34 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:04.918 12:37:34 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:05.176 /dev/nbd0 00:07:05.176 12:37:35 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:05.176 12:37:35 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 
00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:05.176 1+0 records in 00:07:05.176 1+0 records out 00:07:05.176 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002299 s, 17.8 MB/s 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:05.176 12:37:35 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:05.176 12:37:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:05.176 12:37:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:05.176 12:37:35 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:05.434 /dev/nbd1 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:05.434 1+0 records in 00:07:05.434 1+0 records out 00:07:05.434 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235718 s, 17.4 MB/s 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:05.434 12:37:35 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:05.434 12:37:35 event.app_repeat -- 
common/autotest_common.sh@893 -- # return 0 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:05.434 { 00:07:05.434 "nbd_device": "/dev/nbd0", 00:07:05.434 "bdev_name": "Malloc0" 00:07:05.434 }, 00:07:05.434 { 00:07:05.434 "nbd_device": "/dev/nbd1", 00:07:05.434 "bdev_name": "Malloc1" 00:07:05.434 } 00:07:05.434 ]' 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:05.434 { 00:07:05.434 "nbd_device": "/dev/nbd0", 00:07:05.434 "bdev_name": "Malloc0" 00:07:05.434 }, 00:07:05.434 { 00:07:05.434 "nbd_device": "/dev/nbd1", 00:07:05.434 "bdev_name": "Malloc1" 00:07:05.434 } 00:07:05.434 ]' 00:07:05.434 12:37:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:05.692 /dev/nbd1' 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:05.692 /dev/nbd1' 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:05.692 256+0 records in 00:07:05.692 256+0 records out 00:07:05.692 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111848 s, 93.8 MB/s 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:05.692 256+0 records in 00:07:05.692 256+0 records out 00:07:05.692 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.020101 s, 52.2 MB/s 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:05.692 256+0 records in 00:07:05.692 256+0 records out 00:07:05.692 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0219516 s, 47.8 MB/s 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:05.692 12:37:35 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:05.950 12:37:35 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:06.208 12:37:36 event.app_repeat -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:06.208 12:37:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:06.467 12:37:36 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:06.467 12:37:36 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:06.467 12:37:36 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:06.725 [2024-11-28 12:37:36.708447] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:06.725 [2024-11-28 12:37:36.731463] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:06.725 [2024-11-28 12:37:36.731466] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.725 [2024-11-28 12:37:36.771730] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:06.725 [2024-11-28 12:37:36.771775] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
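Round 0 above is the whole nbd data path in miniature: two 64 MB, 4096-byte-block Malloc bdevs are created over /var/tmp/spdk-nbd.sock, exported as /dev/nbd0 and /dev/nbd1, fed 1 MiB of /dev/urandom through O_DIRECT writes, verified with cmp, and unwound again; nbd_get_disks returning [] confirms the teardown before spdk_kill_instance SIGTERM and the 3-second settle. The write/verify cycle reduces to:

  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  T=$SPDK/test/event/nbdrandtest
  dd if=/dev/urandom of=$T bs=4096 count=256          # 1 MiB of random data
  for nbd in /dev/nbd0 /dev/nbd1; do
    dd if=$T of=$nbd bs=4096 count=256 oflag=direct   # write it through the nbd device
    cmp -b -n 1M $T $nbd                              # read back and byte-compare
  done
  rm $T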
00:07:10.010 12:37:39 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:10.010 12:37:39 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:07:10.010 spdk_app_start Round 1 00:07:10.010 12:37:39 event.app_repeat -- event/event.sh@25 -- # waitforlisten 599535 /var/tmp/spdk-nbd.sock 00:07:10.010 12:37:39 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 599535 ']' 00:07:10.010 12:37:39 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:10.010 12:37:39 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:10.010 12:37:39 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:10.010 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:10.010 12:37:39 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:10.010 12:37:39 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:10.010 12:37:39 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:10.010 12:37:39 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:10.010 12:37:39 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:10.010 Malloc0 00:07:10.010 12:37:39 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:10.269 Malloc1 00:07:10.269 12:37:40 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:10.269 12:37:40 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:10.269 /dev/nbd0 00:07:10.528 12:37:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:10.528 12:37:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd0 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:10.528 1+0 records in 00:07:10.528 1+0 records out 00:07:10.528 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241531 s, 17.0 MB/s 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:10.528 12:37:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:10.528 12:37:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:10.528 12:37:40 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:10.528 /dev/nbd1 00:07:10.528 12:37:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:10.528 12:37:40 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:10.528 12:37:40 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:10.788 1+0 records in 00:07:10.788 1+0 records out 00:07:10.788 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000149742 s, 27.4 MB/s 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:10.788 12:37:40 
event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:10.788 12:37:40 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:10.788 { 00:07:10.788 "nbd_device": "/dev/nbd0", 00:07:10.788 "bdev_name": "Malloc0" 00:07:10.788 }, 00:07:10.788 { 00:07:10.788 "nbd_device": "/dev/nbd1", 00:07:10.788 "bdev_name": "Malloc1" 00:07:10.788 } 00:07:10.788 ]' 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:10.788 { 00:07:10.788 "nbd_device": "/dev/nbd0", 00:07:10.788 "bdev_name": "Malloc0" 00:07:10.788 }, 00:07:10.788 { 00:07:10.788 "nbd_device": "/dev/nbd1", 00:07:10.788 "bdev_name": "Malloc1" 00:07:10.788 } 00:07:10.788 ]' 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:10.788 /dev/nbd1' 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:10.788 /dev/nbd1' 00:07:10.788 12:37:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:11.048 256+0 records in 00:07:11.048 256+0 records out 00:07:11.048 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0115155 s, 91.1 MB/s 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:11.048 256+0 records in 00:07:11.048 256+0 records out 00:07:11.048 1048576 
bytes (1.0 MB, 1.0 MiB) copied, 0.0204041 s, 51.4 MB/s 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:11.048 256+0 records in 00:07:11.048 256+0 records out 00:07:11.048 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0211531 s, 49.6 MB/s 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:11.048 12:37:40 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:11.048 12:37:41 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:11.048 12:37:41 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.048 12:37:41 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:11.048 12:37:41 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:11.048 12:37:41 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:11.048 12:37:41 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:11.048 12:37:41 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:11.308 12:37:41 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:11.567 12:37:41 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:11.567 12:37:41 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:11.826 12:37:41 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:12.087 [2024-11-28 12:37:42.041035] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:12.087 [2024-11-28 12:37:42.064362] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:12.087 [2024-11-28 12:37:42.064364] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.087 [2024-11-28 12:37:42.105517] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:12.087 [2024-11-28 12:37:42.105561] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
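The nbd setup traced above leans on one polling helper: waitfornbd loops up to 20 times until the device name shows up in /proc/partitions, then proves the device actually answers by copying a single 4 KiB block with O_DIRECT and checking the copied size. A minimal sketch consistent with the traced commands — paths shortened, and the sleep interval between retries assumed, since it never appears in the xtrace — looks like:

  # Sketch reconstructed from the xtrace above; the 0.1s sleep and /tmp path are assumptions.
  waitfornbd() {
      local nbd_name=$1
      local i
      for (( i = 1; i <= 20; i++ )); do
          # The device appears in /proc/partitions once the kernel attaches it
          if grep -q -w "$nbd_name" /proc/partitions; then
              break
          fi
          sleep 0.1
      done
      # Prove the device is readable: copy one 4 KiB block with direct I/O
      dd if=/dev/$nbd_name of=/tmp/nbdtest bs=4096 count=1 iflag=direct || return 1
      local size
      size=$(stat -c %s /tmp/nbdtest)
      rm -f /tmp/nbdtest
      [ "$size" != 0 ]    # a non-empty copy means the block device responded
  }

The "1+0 records in / 4096 bytes copied" lines in the trace are the dd read probe from this helper succeeding on each device.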
00:07:15.377 12:37:44 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:15.377 12:37:44 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:07:15.377 spdk_app_start Round 2 00:07:15.377 12:37:44 event.app_repeat -- event/event.sh@25 -- # waitforlisten 599535 /var/tmp/spdk-nbd.sock 00:07:15.377 12:37:44 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 599535 ']' 00:07:15.377 12:37:44 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:15.377 12:37:44 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:15.377 12:37:44 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:15.377 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:15.377 12:37:44 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:15.377 12:37:44 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:15.377 12:37:45 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:15.377 12:37:45 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:15.377 12:37:45 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:15.377 Malloc0 00:07:15.377 12:37:45 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:15.377 Malloc1 00:07:15.377 12:37:45 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:15.377 12:37:45 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.378 12:37:45 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:15.637 /dev/nbd0 00:07:15.637 12:37:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:15.637 12:37:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd0 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:15.637 1+0 records in 00:07:15.637 1+0 records out 00:07:15.637 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000141121 s, 29.0 MB/s 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:15.637 12:37:45 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:15.637 12:37:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:15.637 12:37:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.637 12:37:45 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:15.897 /dev/nbd1 00:07:15.897 12:37:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:15.897 12:37:45 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:15.897 1+0 records in 00:07:15.897 1+0 records out 00:07:15.897 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251935 s, 16.3 MB/s 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:07:15.897 12:37:45 
event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:15.897 12:37:45 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:07:15.897 12:37:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:15.897 12:37:45 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:15.897 12:37:45 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:15.897 12:37:45 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:15.897 12:37:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:16.156 { 00:07:16.156 "nbd_device": "/dev/nbd0", 00:07:16.156 "bdev_name": "Malloc0" 00:07:16.156 }, 00:07:16.156 { 00:07:16.156 "nbd_device": "/dev/nbd1", 00:07:16.156 "bdev_name": "Malloc1" 00:07:16.156 } 00:07:16.156 ]' 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:16.156 { 00:07:16.156 "nbd_device": "/dev/nbd0", 00:07:16.156 "bdev_name": "Malloc0" 00:07:16.156 }, 00:07:16.156 { 00:07:16.156 "nbd_device": "/dev/nbd1", 00:07:16.156 "bdev_name": "Malloc1" 00:07:16.156 } 00:07:16.156 ]' 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:16.156 /dev/nbd1' 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:16.156 /dev/nbd1' 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:16.156 256+0 records in 00:07:16.156 256+0 records out 00:07:16.156 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0110485 s, 94.9 MB/s 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:16.156 256+0 records in 00:07:16.156 256+0 records out 00:07:16.156 1048576 
bytes (1.0 MB, 1.0 MiB) copied, 0.0206526 s, 50.8 MB/s 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:16.156 12:37:46 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:16.415 256+0 records in 00:07:16.415 256+0 records out 00:07:16.415 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0214065 s, 49.0 MB/s 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:16.415 12:37:46 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:16.674 12:37:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:16.933 12:37:46 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:07:16.933 12:37:46 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:07:17.192 12:37:47 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:07:17.451 [2024-11-28 12:37:47.328582] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:17.451 [2024-11-28 12:37:47.351808] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:17.451 [2024-11-28 12:37:47.351809] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.451 [2024-11-28 12:37:47.391916] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:07:17.451 [2024-11-28 12:37:47.391961] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
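Each round's data check above is the same helper called twice, once per mode: the write pass seeds 1 MiB of random data and pushes it to every nbd device with O_DIRECT, the verify pass byte-compares the first 1 MiB of each device against the seed file, then deletes it. A sketch matching the traced commands, with the temp path shortened:

  # Write/verify pattern from the trace; called as:
  #   nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
  #   nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
  nbd_dd_data_verify() {
      local nbd_list=($1)        # space-separated device list, as in the trace
      local operation=$2
      local tmp_file=/tmp/nbdrandtest
      local i
      if [ "$operation" = write ]; then
          # Seed 1 MiB (256 x 4 KiB) of random data, then copy it to each device
          dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
          for i in "${nbd_list[@]}"; do
              dd if="$tmp_file" of="$i" bs=4096 count=256 oflag=direct
          done
      elif [ "$operation" = verify ]; then
          # Byte-compare the first 1 MiB of each device against the seed file
          for i in "${nbd_list[@]}"; do
              cmp -b -n 1M "$tmp_file" "$i"
          done
          rm "$tmp_file"
      fi
  }

A silent cmp is a pass; any mismatch would fail the round before the devices are stopped.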
00:07:20.742 12:37:50 event.app_repeat -- event/event.sh@38 -- # waitforlisten 599535 /var/tmp/spdk-nbd.sock 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 599535 ']' 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:20.742 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:07:20.742 12:37:50 event.app_repeat -- event/event.sh@39 -- # killprocess 599535 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 599535 ']' 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 599535 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@959 -- # uname 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 599535 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 599535' 00:07:20.742 killing process with pid 599535 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@973 -- # kill 599535 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@978 -- # wait 599535 00:07:20.742 spdk_app_start is called in Round 0. 00:07:20.742 Shutdown signal received, stop current app iteration 00:07:20.742 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 reinitialization... 00:07:20.742 spdk_app_start is called in Round 1. 00:07:20.742 Shutdown signal received, stop current app iteration 00:07:20.742 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 reinitialization... 00:07:20.742 spdk_app_start is called in Round 2. 00:07:20.742 Shutdown signal received, stop current app iteration 00:07:20.742 Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 reinitialization... 00:07:20.742 spdk_app_start is called in Round 3. 
00:07:20.742 Shutdown signal received, stop current app iteration 00:07:20.742 12:37:50 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:07:20.742 12:37:50 event.app_repeat -- event/event.sh@42 -- # return 0 00:07:20.742 00:07:20.742 real 0m17.012s 00:07:20.742 user 0m36.586s 00:07:20.742 sys 0m3.224s 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:20.742 12:37:50 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:20.742 ************************************ 00:07:20.742 END TEST app_repeat 00:07:20.742 ************************************ 00:07:20.742 12:37:50 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:07:20.742 12:37:50 event -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:20.742 12:37:50 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:20.742 12:37:50 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:20.742 12:37:50 event -- common/autotest_common.sh@10 -- # set +x 00:07:20.742 ************************************ 00:07:20.742 START TEST cpu_locks 00:07:20.742 ************************************ 00:07:20.742 12:37:50 event.cpu_locks -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:07:20.742 * Looking for test storage... 00:07:20.742 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1693 -- # lcov --version 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:20.743 12:37:50 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:20.743 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:20.743 --rc genhtml_branch_coverage=1 00:07:20.743 --rc genhtml_function_coverage=1 00:07:20.743 --rc genhtml_legend=1 00:07:20.743 --rc geninfo_all_blocks=1 00:07:20.743 --rc geninfo_unexecuted_blocks=1 00:07:20.743 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:20.743 ' 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:20.743 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:20.743 --rc genhtml_branch_coverage=1 00:07:20.743 --rc genhtml_function_coverage=1 00:07:20.743 --rc genhtml_legend=1 00:07:20.743 --rc geninfo_all_blocks=1 00:07:20.743 --rc geninfo_unexecuted_blocks=1 00:07:20.743 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:20.743 ' 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:20.743 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:20.743 --rc genhtml_branch_coverage=1 00:07:20.743 --rc genhtml_function_coverage=1 00:07:20.743 --rc genhtml_legend=1 00:07:20.743 --rc geninfo_all_blocks=1 00:07:20.743 --rc geninfo_unexecuted_blocks=1 00:07:20.743 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:20.743 ' 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:20.743 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:20.743 --rc genhtml_branch_coverage=1 00:07:20.743 --rc genhtml_function_coverage=1 00:07:20.743 --rc genhtml_legend=1 00:07:20.743 --rc geninfo_all_blocks=1 00:07:20.743 --rc geninfo_unexecuted_blocks=1 00:07:20.743 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:20.743 ' 00:07:20.743 12:37:50 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:07:20.743 12:37:50 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:07:20.743 12:37:50 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:07:20.743 12:37:50 event.cpu_locks -- 
event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:20.743 12:37:50 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:21.003 ************************************ 00:07:21.003 START TEST default_locks 00:07:21.003 ************************************ 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=602052 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 602052 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 602052 ']' 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:21.003 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:21.003 12:37:50 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:21.003 [2024-11-28 12:37:50.906746] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:21.003 [2024-11-28 12:37:50.906833] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid602052 ] 00:07:21.003 [2024-11-28 12:37:51.043461] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
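The waitforlisten call traced above only shows its setup lines (pid check, rpc_addr, max_retries) because the polling loop itself runs with xtrace disabled. The loop body below is therefore an assumption, not a reproduction: a plausible stand-in that probes the RPC socket with rpc_get_methods until the target answers or the pid dies.

  # The loop is assumed: the real helper hides its body behind xtrace_disable,
  # so only the setup lines appear in this log.
  waitforlisten() {
      local pid=$1
      local rpc_addr=${2:-/var/tmp/spdk.sock}
      local max_retries=100
      local i
      echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
      for (( i = 0; i < max_retries; i++ )); do
          # kill -0 only tests existence; bail out early if the target died
          kill -0 "$pid" || return 1
          # rpc_get_methods succeeds once the app is serving RPCs on the socket
          if scripts/rpc.py -s "$rpc_addr" rpc_get_methods &>/dev/null; then
              return 0
          fi
          sleep 0.1
      done
      return 1
  }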
00:07:21.003 [2024-11-28 12:37:51.078334] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:21.003 [2024-11-28 12:37:51.102146] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.940 12:37:51 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:21.940 12:37:51 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0 00:07:21.940 12:37:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 602052 00:07:21.940 12:37:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 602052 00:07:21.940 12:37:51 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:22.199 lslocks: write error 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 602052 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 602052 ']' 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 602052 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 602052 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 602052' 00:07:22.199 killing process with pid 602052 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 602052 00:07:22.199 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 602052 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 602052 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 602052 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 602052 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 602052 ']' 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:22.459 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
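The lock check itself is the two traced commands piped together, and it also explains the stray "lslocks: write error" line above: grep -q exits on the first match, closing the pipe, and lslocks reports the resulting EPIPE while still exiting successfully. A sketch of the check:

  # spdk_cpu_lock is the per-core flock the target takes for each scheduled core.
  locks_exist() {
      local pid=$1
      # grep -q quits on first match and closes the pipe early -- that EPIPE
      # is what prints the harmless "lslocks: write error" seen in the log
      lslocks -p "$pid" | grep -q spdk_cpu_lock
  }

With -m 0x1 the target holds exactly one core lock, so the check passes while pid 602052 is alive.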
00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:22.459 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (602052) - No such process 00:07:22.459 ERROR: process (pid: 602052) is no longer running 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:22.459 00:07:22.459 real 0m1.635s 00:07:22.459 user 0m1.633s 00:07:22.459 sys 0m0.560s 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:22.459 12:37:52 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:07:22.459 ************************************ 00:07:22.459 END TEST default_locks 00:07:22.459 ************************************ 00:07:22.459 12:37:52 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:07:22.459 12:37:52 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:22.459 12:37:52 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:22.459 12:37:52 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:22.719 ************************************ 00:07:22.719 START TEST default_locks_via_rpc 00:07:22.719 ************************************ 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=602263 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 602263 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 602263 ']' 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:22.719 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
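The "No such process" block above is the negative half of the test: waitforlisten is rerun against the already-killed pid under a NOT wrapper, so the expected failure is inverted into a pass. Simplified to the inversion visible in the trace (the real helper also distinguishes signal deaths, es > 128, and an allow-list of messages):

  # Sketch of the NOT semantics traced above: run a command expected to fail
  # and invert its exit status.
  NOT() {
      local es=0
      "$@" || es=$?
      # signal-death (es > 128) and allow-list handling omitted here
      (( es != 0 ))
  }

Here NOT waitforlisten 602052 succeeds precisely because kill -0 finds no such process, which is the behavior default_locks is asserting.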
00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:22.719 12:37:52 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:22.719 [2024-11-28 12:37:52.618160] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:22.719 [2024-11-28 12:37:52.618230] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid602263 ] 00:07:22.719 [2024-11-28 12:37:52.754642] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:22.719 [2024-11-28 12:37:52.784365] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.719 [2024-11-28 12:37:52.808557] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 602263 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 602263 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 602263 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 602263 ']' 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 602263 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname 00:07:23.656 12:37:53 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:23.656 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 602263 00:07:23.916 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:23.916 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:23.916 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 602263' 00:07:23.916 killing process with pid 602263 00:07:23.916 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 602263 00:07:23.916 12:37:53 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 602263 00:07:24.175 00:07:24.175 real 0m1.489s 00:07:24.175 user 0m1.479s 00:07:24.175 sys 0m0.515s 00:07:24.175 12:37:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:24.175 12:37:54 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:24.175 ************************************ 00:07:24.175 END TEST default_locks_via_rpc 00:07:24.175 ************************************ 00:07:24.175 12:37:54 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:07:24.175 12:37:54 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:24.175 12:37:54 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:24.175 12:37:54 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:24.175 ************************************ 00:07:24.175 START TEST non_locking_app_on_locked_coremask 00:07:24.175 ************************************ 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=602518 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 602518 /var/tmp/spdk.sock 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 602518 ']' 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:24.175 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
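The teardown pattern repeated throughout this run (pids 599535, 602052, 602263, and the ones below) is also visible in full in the trace: verify the pid is alive, confirm the command name is the expected reactor rather than sudo, then kill and reap it. A sketch, with the sudo branch simplified to a refusal:

  # Teardown helper as traced; the real helper's sudo handling is simplified here.
  killprocess() {
      local pid=$1
      [ -n "$pid" ] || return 1
      kill -0 "$pid" || return 1              # still alive?
      if [ "$(uname)" = Linux ]; then
          local process_name
          process_name=$(ps --no-headers -o comm= "$pid")   # e.g. reactor_0
          [ "$process_name" = sudo ] && return 1            # never kill sudo itself
      fi
      echo "killing process with pid $pid"
      kill "$pid"
      wait "$pid"    # reap the child so its exit status is collected
  }

The wait works because spdk_tgt was launched by the same test shell, so it is a direct child.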
00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:24.175 12:37:54 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:24.175 [2024-11-28 12:37:54.186400] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:24.175 [2024-11-28 12:37:54.186468] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid602518 ] 00:07:24.434 [2024-11-28 12:37:54.323164] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:24.434 [2024-11-28 12:37:54.360626] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:24.434 [2024-11-28 12:37:54.384980] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=602648 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 602648 /var/tmp/spdk2.sock 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 602648 ']' 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:25.003 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:25.003 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:25.003 [2024-11-28 12:37:55.067278] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:25.003 [2024-11-28 12:37:55.067368] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid602648 ] 00:07:25.262 [2024-11-28 12:37:55.206039] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:25.262 [2024-11-28 12:37:55.263189] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
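The non_locking_app_on_locked_coremask setup above is two targets contending for the same core: the first instance (pid 602518) takes the core-0 flock, and the second (pid 602648) is started with --disable-cpumask-locks and its own RPC socket so it can share the core without conflicting — hence the "CPU core locks deactivated" notice. A sketch of the launch sequence, with the full build paths shortened:

  # Both targets request core 0; only the first takes the per-core flock.
  build/bin/spdk_tgt -m 0x1 &
  pid1=$!                                 # holds spdk_cpu_lock for core 0
  waitforlisten $pid1 /var/tmp/spdk.sock

  build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock &
  pid2=$!                                 # logs "CPU core locks deactivated"
  waitforlisten $pid2 /var/tmp/spdk2.sock

The locking_app_on_unlocked_coremask test further down is the mirror image: the first instance opts out of the locks and the second one takes them.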
00:07:25.262 [2024-11-28 12:37:55.263214] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.262 [2024-11-28 12:37:55.310852] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.830 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:25.830 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:25.830 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 602518 00:07:25.830 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 602518 00:07:25.830 12:37:55 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:27.229 lslocks: write error 00:07:27.229 12:37:56 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 602518 00:07:27.229 12:37:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 602518 ']' 00:07:27.229 12:37:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 602518 00:07:27.229 12:37:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:27.229 12:37:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:27.229 12:37:56 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 602518 00:07:27.229 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:27.229 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:27.229 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 602518' 00:07:27.229 killing process with pid 602518 00:07:27.229 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 602518 00:07:27.229 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 602518 00:07:27.489 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 602648 00:07:27.489 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 602648 ']' 00:07:27.489 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 602648 00:07:27.489 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:27.489 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:27.489 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 602648 00:07:27.748 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:27.748 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:27.748 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 602648' 00:07:27.748 killing 
process with pid 602648 00:07:27.748 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 602648 00:07:27.748 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 602648 00:07:28.008 00:07:28.008 real 0m3.786s 00:07:28.008 user 0m3.980s 00:07:28.008 sys 0m1.288s 00:07:28.008 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:28.008 12:37:57 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:28.008 ************************************ 00:07:28.008 END TEST non_locking_app_on_locked_coremask 00:07:28.008 ************************************ 00:07:28.008 12:37:57 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:07:28.008 12:37:57 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:28.008 12:37:57 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:28.008 12:37:57 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:28.008 ************************************ 00:07:28.008 START TEST locking_app_on_unlocked_coremask 00:07:28.008 ************************************ 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=603037 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 603037 /var/tmp/spdk.sock 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 603037 ']' 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:28.008 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:28.008 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:28.008 [2024-11-28 12:37:58.054220] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:28.008 [2024-11-28 12:37:58.054305] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid603037 ] 00:07:28.267 [2024-11-28 12:37:58.192190] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:28.267 [2024-11-28 12:37:58.229968] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:28.267 [2024-11-28 12:37:58.229994] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.267 [2024-11-28 12:37:58.253770] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=603209 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 603209 /var/tmp/spdk2.sock 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 603209 ']' 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:28.835 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:28.835 12:37:58 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:28.836 [2024-11-28 12:37:58.925602] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:28.836 [2024-11-28 12:37:58.925668] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid603209 ] 00:07:29.095 [2024-11-28 12:37:59.066060] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
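waitforlisten then blocks until the second target's RPC socket accepts connections. Conceptually it is a poll loop like this sketch (the step and timeout values are assumptions, not the autotest_common.sh source):

  # Poll for the UNIX-domain RPC socket of the second spdk_tgt instance.
  SOCK=/var/tmp/spdk2.sock
  for _ in $(seq 1 100); do
    [ -S "$SOCK" ] && break     # socket node exists, target is listening
    sleep 0.1
  done
  [ -S "$SOCK" ] || echo "timed out waiting for $SOCK" >&2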
00:07:29.095 [2024-11-28 12:37:59.123800] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.095 [2024-11-28 12:37:59.171844] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.032 12:37:59 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:30.032 12:37:59 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:30.032 12:37:59 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 603209 00:07:30.032 12:37:59 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 603209 00:07:30.032 12:37:59 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:30.971 lslocks: write error 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 603037 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 603037 ']' 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 603037 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 603037 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 603037' 00:07:30.971 killing process with pid 603037 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 603037 00:07:30.971 12:38:00 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 603037 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 603209 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 603209 ']' 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 603209 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 603209 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 603209' 00:07:31.540 killing process with pid 603209 00:07:31.540 12:38:01 
event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 603209 00:07:31.540 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 603209 00:07:31.800 00:07:31.800 real 0m3.734s 00:07:31.800 user 0m3.904s 00:07:31.800 sys 0m1.262s 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:31.800 ************************************ 00:07:31.800 END TEST locking_app_on_unlocked_coremask 00:07:31.800 ************************************ 00:07:31.800 12:38:01 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:07:31.800 12:38:01 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:31.800 12:38:01 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:31.800 12:38:01 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:31.800 ************************************ 00:07:31.800 START TEST locking_app_on_locked_coremask 00:07:31.800 ************************************ 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=603602 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 603602 /var/tmp/spdk.sock 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 603602 ']' 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:31.800 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:31.800 12:38:01 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:31.800 [2024-11-28 12:38:01.855996] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:31.800 [2024-11-28 12:38:01.856045] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid603602 ] 00:07:32.059 [2024-11-28 12:38:01.990290] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:32.059 [2024-11-28 12:38:02.024442] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.059 [2024-11-28 12:38:02.048568] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=603774 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 603774 /var/tmp/spdk2.sock 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 603774 /var/tmp/spdk2.sock 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 603774 /var/tmp/spdk2.sock 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 603774 ']' 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:32.628 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:32.628 12:38:02 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:32.628 [2024-11-28 12:38:02.733032] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:32.628 [2024-11-28 12:38:02.733083] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid603774 ] 00:07:32.887 [2024-11-28 12:38:02.867954] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
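This test expects the opposite outcome: the first target already holds the core-0 lock, so the second target must fail to start, and the script wraps waitforlisten in NOT to invert the exit status, as the claim error just below confirms. A hedged sketch of that negation idiom (illustrative, not the autotest_common.sh definition):

  # Succeed only when the wrapped command fails.
  NOT() {
    if "$@"; then
      return 1
    fi
    return 0
  }
  NOT false && echo "exit status inverted as expected"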
00:07:32.887 [2024-11-28 12:38:02.924427] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 603602 has claimed it. 00:07:32.887 [2024-11-28 12:38:02.924458] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:33.454 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (603774) - No such process 00:07:33.455 ERROR: process (pid: 603774) is no longer running 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 603602 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 603602 00:07:33.455 12:38:03 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:07:34.023 lslocks: write error 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 603602 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 603602 ']' 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 603602 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 603602 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 603602' 00:07:34.023 killing process with pid 603602 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 603602 00:07:34.023 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 603602 00:07:34.282 00:07:34.282 real 0m2.552s 00:07:34.282 user 0m2.719s 00:07:34.282 sys 0m0.748s 00:07:34.282 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:34.282 12:38:04 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:34.282 ************************************ 00:07:34.282 END TEST locking_app_on_locked_coremask 00:07:34.282 ************************************ 00:07:34.541 12:38:04 event.cpu_locks -- 
event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:07:34.541 12:38:04 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:34.541 12:38:04 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:34.541 12:38:04 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:34.541 ************************************ 00:07:34.541 START TEST locking_overlapped_coremask 00:07:34.541 ************************************ 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=603989 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 603989 /var/tmp/spdk.sock 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 603989 ']' 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:34.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:34.541 12:38:04 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:34.541 [2024-11-28 12:38:04.499143] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:34.541 [2024-11-28 12:38:04.499208] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid603989 ] 00:07:34.541 [2024-11-28 12:38:04.635875] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
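The -m 0x7 mask above selects cores 0 through 2, which is why three reactors start and why check_remaining_locks later compares against /var/tmp/spdk_cpu_lock_000 through _002. Expanding a cpumask into the lock files it implies is plain bit arithmetic (illustrative helper, not part of cpu_locks.sh):

  # List the per-core lock files an SPDK app with this cpumask would claim.
  mask=0x7
  for core in $(seq 0 63); do
    (( (mask >> core) & 1 )) && printf '/var/tmp/spdk_cpu_lock_%03d\n' "$core"
  done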
00:07:34.800 [2024-11-28 12:38:04.669963] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:34.800 [2024-11-28 12:38:04.695244] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:34.800 [2024-11-28 12:38:04.695333] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:34.800 [2024-11-28 12:38:04.695334] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=604166 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 604166 /var/tmp/spdk2.sock 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 604166 /var/tmp/spdk2.sock 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 604166 /var/tmp/spdk2.sock 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 604166 ']' 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:35.367 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:35.367 12:38:05 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:35.367 [2024-11-28 12:38:05.375315] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:35.367 [2024-11-28 12:38:05.375384] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid604166 ] 00:07:35.626 [2024-11-28 12:38:05.515332] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. 
Enabled only for validation. 00:07:35.626 [2024-11-28 12:38:05.573129] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 603989 has claimed it. 00:07:35.626 [2024-11-28 12:38:05.573162] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:36.194 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 850: kill: (604166) - No such process 00:07:36.194 ERROR: process (pid: 604166) is no longer running 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 603989 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 603989 ']' 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 603989 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 603989 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 603989' 00:07:36.194 killing process with pid 603989 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 603989 00:07:36.194 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 603989 00:07:36.455 00:07:36.455 real 0m1.914s 00:07:36.455 user 0m5.299s 00:07:36.455 sys 0m0.440s 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1130 -- # 
xtrace_disable 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:07:36.455 ************************************ 00:07:36.455 END TEST locking_overlapped_coremask 00:07:36.455 ************************************ 00:07:36.455 12:38:06 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:36.455 12:38:06 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:36.455 12:38:06 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:36.455 12:38:06 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:36.455 ************************************ 00:07:36.455 START TEST locking_overlapped_coremask_via_rpc 00:07:36.455 ************************************ 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=604364 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 604364 /var/tmp/spdk.sock 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 604364 ']' 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:36.455 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:36.455 12:38:06 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:36.455 [2024-11-28 12:38:06.487623] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:36.455 [2024-11-28 12:38:06.487669] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid604364 ] 00:07:36.714 [2024-11-28 12:38:06.622005] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:36.714 [2024-11-28 12:38:06.656559] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
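Because of --disable-cpumask-locks, the target above boots without claiming its lock files; this test claims them afterwards over JSON-RPC. Via the rpc.py wrapper that rpc_cmd resolves to, the call would look like this (paths as used throughout this run):

  # Claim the CPU-core locks after startup on the primary target.
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py \
      -s /var/tmp/spdk.sock framework_enable_cpumask_locks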
00:07:36.714 [2024-11-28 12:38:06.656583] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:36.714 [2024-11-28 12:38:06.682910] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:36.714 [2024-11-28 12:38:06.682996] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:36.715 [2024-11-28 12:38:06.682998] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=604387 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 604387 /var/tmp/spdk2.sock 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 604387 ']' 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:37.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:37.283 12:38:07 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:37.283 [2024-11-28 12:38:07.372853] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:37.283 [2024-11-28 12:38:07.372921] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid604387 ] 00:07:37.542 [2024-11-28 12:38:07.511324] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:37.542 [2024-11-28 12:38:07.569366] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:07:37.542 [2024-11-28 12:38:07.569388] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:37.542 [2024-11-28 12:38:07.623711] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 3 00:07:37.542 [2024-11-28 12:38:07.627522] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 2 00:07:37.542 [2024-11-28 12:38:07.627525] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 4 00:07:38.109 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:38.109 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:38.109 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:38.109 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:38.109 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:38.367 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:38.368 [2024-11-28 12:38:08.252540] app.c: 782:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 604364 has claimed it. 
00:07:38.368 request: 00:07:38.368 { 00:07:38.368 "method": "framework_enable_cpumask_locks", 00:07:38.368 "req_id": 1 00:07:38.368 } 00:07:38.368 Got JSON-RPC error response 00:07:38.368 response: 00:07:38.368 { 00:07:38.368 "code": -32603, 00:07:38.368 "message": "Failed to claim CPU core: 2" 00:07:38.368 } 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 604364 /var/tmp/spdk.sock 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 604364 ']' 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:38.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 604387 /var/tmp/spdk2.sock 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 604387 ']' 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:38.368 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
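The -32603 body above is JSON-RPC's internal-error code and is the expected result here: pid 604364 already holds the core-2 lock, so the second target's claim must be refused. The failing call corresponds to the following (same socket as in the trace; the || branch only makes the expected failure visible):

  # Expected to fail while the first target still holds the locks.
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py \
      -s /var/tmp/spdk2.sock framework_enable_cpumask_locks \
    || echo "refused: core already claimed by pid 604364"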
00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:38.368 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:38.627 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:38.627 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:07:38.627 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:38.627 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:38.627 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:38.627 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:38.627 00:07:38.627 real 0m2.203s 00:07:38.627 user 0m0.935s 00:07:38.627 sys 0m0.197s 00:07:38.627 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:38.627 12:38:08 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:38.627 ************************************ 00:07:38.627 END TEST locking_overlapped_coremask_via_rpc 00:07:38.627 ************************************ 00:07:38.627 12:38:08 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:07:38.627 12:38:08 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 604364 ]] 00:07:38.627 12:38:08 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 604364 00:07:38.627 12:38:08 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 604364 ']' 00:07:38.627 12:38:08 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 604364 00:07:38.627 12:38:08 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:38.627 12:38:08 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:38.627 12:38:08 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 604364 00:07:38.887 12:38:08 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:38.887 12:38:08 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:38.887 12:38:08 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 604364' 00:07:38.887 killing process with pid 604364 00:07:38.887 12:38:08 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 604364 00:07:38.887 12:38:08 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 604364 00:07:39.147 12:38:09 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 604387 ]] 00:07:39.147 12:38:09 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 604387 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 604387 ']' 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 604387 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 
00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 604387 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 604387' 00:07:39.147 killing process with pid 604387 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 604387 00:07:39.147 12:38:09 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 604387 00:07:39.405 12:38:09 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:39.405 12:38:09 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:07:39.405 12:38:09 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 604364 ]] 00:07:39.405 12:38:09 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 604364 00:07:39.405 12:38:09 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 604364 ']' 00:07:39.405 12:38:09 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 604364 00:07:39.405 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (604364) - No such process 00:07:39.405 12:38:09 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 604364 is not found' 00:07:39.405 Process with pid 604364 is not found 00:07:39.405 12:38:09 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 604387 ]] 00:07:39.405 12:38:09 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 604387 00:07:39.405 12:38:09 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 604387 ']' 00:07:39.405 12:38:09 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 604387 00:07:39.405 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 958: kill: (604387) - No such process 00:07:39.405 12:38:09 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 604387 is not found' 00:07:39.405 Process with pid 604387 is not found 00:07:39.405 12:38:09 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:07:39.405 00:07:39.405 real 0m18.786s 00:07:39.405 user 0m30.924s 00:07:39.405 sys 0m6.075s 00:07:39.405 12:38:09 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:39.405 12:38:09 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:07:39.405 ************************************ 00:07:39.405 END TEST cpu_locks 00:07:39.405 ************************************ 00:07:39.405 00:07:39.405 real 0m45.034s 00:07:39.405 user 1m23.133s 00:07:39.405 sys 0m10.426s 00:07:39.405 12:38:09 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:39.405 12:38:09 event -- common/autotest_common.sh@10 -- # set +x 00:07:39.405 ************************************ 00:07:39.405 END TEST event 00:07:39.405 ************************************ 00:07:39.405 12:38:09 -- spdk/autotest.sh@169 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:39.405 12:38:09 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:39.405 12:38:09 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:39.405 12:38:09 -- common/autotest_common.sh@10 -- # set +x 00:07:39.664 ************************************ 00:07:39.664 START TEST thread 00:07:39.664 ************************************ 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1129 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:39.664 * Looking for test storage... 00:07:39.664 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1693 -- # lcov --version 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:39.664 12:38:09 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:39.664 12:38:09 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:39.664 12:38:09 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:39.664 12:38:09 thread -- scripts/common.sh@336 -- # IFS=.-: 00:07:39.664 12:38:09 thread -- scripts/common.sh@336 -- # read -ra ver1 00:07:39.664 12:38:09 thread -- scripts/common.sh@337 -- # IFS=.-: 00:07:39.664 12:38:09 thread -- scripts/common.sh@337 -- # read -ra ver2 00:07:39.664 12:38:09 thread -- scripts/common.sh@338 -- # local 'op=<' 00:07:39.664 12:38:09 thread -- scripts/common.sh@340 -- # ver1_l=2 00:07:39.664 12:38:09 thread -- scripts/common.sh@341 -- # ver2_l=1 00:07:39.664 12:38:09 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:39.664 12:38:09 thread -- scripts/common.sh@344 -- # case "$op" in 00:07:39.664 12:38:09 thread -- scripts/common.sh@345 -- # : 1 00:07:39.664 12:38:09 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:39.664 12:38:09 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:39.664 12:38:09 thread -- scripts/common.sh@365 -- # decimal 1 00:07:39.664 12:38:09 thread -- scripts/common.sh@353 -- # local d=1 00:07:39.664 12:38:09 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:39.664 12:38:09 thread -- scripts/common.sh@355 -- # echo 1 00:07:39.664 12:38:09 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:07:39.664 12:38:09 thread -- scripts/common.sh@366 -- # decimal 2 00:07:39.664 12:38:09 thread -- scripts/common.sh@353 -- # local d=2 00:07:39.664 12:38:09 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:39.664 12:38:09 thread -- scripts/common.sh@355 -- # echo 2 00:07:39.664 12:38:09 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:07:39.664 12:38:09 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:39.664 12:38:09 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:39.664 12:38:09 thread -- scripts/common.sh@368 -- # return 0 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:39.664 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.664 --rc genhtml_branch_coverage=1 00:07:39.664 --rc genhtml_function_coverage=1 00:07:39.664 --rc genhtml_legend=1 00:07:39.664 --rc geninfo_all_blocks=1 00:07:39.664 --rc geninfo_unexecuted_blocks=1 00:07:39.664 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.664 ' 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:39.664 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.664 --rc genhtml_branch_coverage=1 00:07:39.664 --rc genhtml_function_coverage=1 00:07:39.664 --rc genhtml_legend=1 00:07:39.664 --rc geninfo_all_blocks=1 
00:07:39.664 --rc geninfo_unexecuted_blocks=1 00:07:39.664 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.664 ' 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:39.664 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.664 --rc genhtml_branch_coverage=1 00:07:39.664 --rc genhtml_function_coverage=1 00:07:39.664 --rc genhtml_legend=1 00:07:39.664 --rc geninfo_all_blocks=1 00:07:39.664 --rc geninfo_unexecuted_blocks=1 00:07:39.664 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.664 ' 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:39.664 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:39.664 --rc genhtml_branch_coverage=1 00:07:39.664 --rc genhtml_function_coverage=1 00:07:39.664 --rc genhtml_legend=1 00:07:39.664 --rc geninfo_all_blocks=1 00:07:39.664 --rc geninfo_unexecuted_blocks=1 00:07:39.664 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:39.664 ' 00:07:39.664 12:38:09 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:39.664 12:38:09 thread -- common/autotest_common.sh@10 -- # set +x 00:07:39.664 ************************************ 00:07:39.664 START TEST thread_poller_perf 00:07:39.664 ************************************ 00:07:39.664 12:38:09 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:39.921 [2024-11-28 12:38:09.794429] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:39.921 [2024-11-28 12:38:09.794519] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid604836 ] 00:07:39.921 [2024-11-28 12:38:09.934836] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:39.921 [2024-11-28 12:38:09.968273] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:39.921 [2024-11-28 12:38:09.992524] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.921 Running 1000 pollers for 1 seconds with 1 microseconds period. 
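Reading the poller_perf flags against the banner above: -b 1000 registers 1000 pollers, -t 1 runs for one second, and -l gives the poller period in microseconds (the second run below passes -l 0, which the banner reports as a 0-microsecond period, i.e. pollers eligible on every reactor iteration). The two invocations differ only in that flag:

  # Timed pollers, 1 us period, 1 s run.
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1
  # Untimed (busy) pollers, same load.
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1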
00:07:41.296 [2024-11-28T11:38:11.423Z] ======================================
00:07:41.296 [2024-11-28T11:38:11.423Z] busy:2299140916 (cyc)
00:07:41.296 [2024-11-28T11:38:11.423Z] total_run_count: 787000
00:07:41.296 [2024-11-28T11:38:11.423Z] tsc_hz: 2294600000 (cyc)
00:07:41.296 [2024-11-28T11:38:11.423Z] ======================================
00:07:41.296 [2024-11-28T11:38:11.423Z] poller_cost: 2921 (cyc), 1272 (nsec)
00:07:41.296
00:07:41.296 real 0m1.255s
00:07:41.296 user 0m1.071s
00:07:41.296 sys 0m0.079s
00:07:41.296 12:38:11 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:41.296 12:38:11 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:07:41.296 ************************************
00:07:41.296 END TEST thread_poller_perf
00:07:41.296 ************************************
00:07:41.296 12:38:11 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:07:41.296 12:38:11 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']'
00:07:41.296 12:38:11 thread -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:41.296 12:38:11 thread -- common/autotest_common.sh@10 -- # set +x
00:07:41.296 ************************************
00:07:41.296 START TEST thread_poller_perf
00:07:41.296 ************************************
00:07:41.296 12:38:11 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1
00:07:41.296 [2024-11-28 12:38:11.132003] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization...
00:07:41.296 [2024-11-28 12:38:11.132104] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid605031 ]
00:07:41.296 [2024-11-28 12:38:11.272837] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:41.296 [2024-11-28 12:38:11.308896] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:41.296 [2024-11-28 12:38:11.335926] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0
00:07:41.296 Running 1000 pollers for 1 seconds with 0 microseconds period.
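The poller_cost lines in these summaries follow from the figures printed alongside them: busy cycles divided by total_run_count, then converted to nanoseconds via tsc_hz. Recomputing the first block above (formula inferred from the output, not taken from poller_perf's source):

  awk 'BEGIN {
    cyc  = int(2299140916 / 787000)      # 2921 cycles per poller run
    nsec = int(cyc * 1e9 / 2294600000)   # 1272 ns at 2294600000 Hz
    printf "poller_cost: %d (cyc), %d (nsec)\n", cyc, nsec
  }'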
00:07:42.675 [2024-11-28T11:38:12.802Z] ======================================
00:07:42.675 [2024-11-28T11:38:12.802Z] busy:2296030224 (cyc)
00:07:42.675 [2024-11-28T11:38:12.802Z] total_run_count: 13231000
00:07:42.675 [2024-11-28T11:38:12.802Z] tsc_hz: 2294600000 (cyc)
00:07:42.675 [2024-11-28T11:38:12.802Z] ======================================
00:07:42.675 [2024-11-28T11:38:12.802Z] poller_cost: 173 (cyc), 75 (nsec)
00:07:42.675
00:07:42.675 real 0m1.255s
00:07:42.675 user 0m1.063s
00:07:42.675 sys 0m0.087s
00:07:42.675 12:38:12 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:42.675 12:38:12 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x
00:07:42.675 ************************************
00:07:42.675 END TEST thread_poller_perf
00:07:42.675 ************************************
00:07:42.675 12:38:12 thread -- thread/thread.sh@17 -- # [[ n != \y ]]
00:07:42.675 12:38:12 thread -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock
00:07:42.675 12:38:12 thread -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:42.675 12:38:12 thread -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:42.675 12:38:12 thread -- common/autotest_common.sh@10 -- # set +x
00:07:42.675 ************************************
00:07:42.675 START TEST thread_spdk_lock
00:07:42.675 ************************************
00:07:42.675 12:38:12 thread.thread_spdk_lock -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock
00:07:42.675 [2024-11-28 12:38:12.468382] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization...
00:07:42.675 [2024-11-28 12:38:12.468468] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid605233 ]
00:07:42.675 [2024-11-28 12:38:12.609452] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation.
00:07:42.675 [2024-11-28 12:38:12.641686] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:42.675 [2024-11-28 12:38:12.666023] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 1 00:07:42.675 [2024-11-28 12:38:12.666026] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.243 [2024-11-28 12:38:13.155607] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 980:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:43.243 [2024-11-28 12:38:13.155640] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3112:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:43.243 [2024-11-28 12:38:13.155667] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3067:sspin_stacks_print: *ERROR*: spinlock 0x1361200 00:07:43.243 [2024-11-28 12:38:13.156284] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:43.243 [2024-11-28 12:38:13.156388] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1041:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:43.243 [2024-11-28 12:38:13.156406] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 875:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:43.243 Starting test contend 00:07:43.243 Worker Delay Wait us Hold us Total us 00:07:43.243 0 3 168137 186855 354992 00:07:43.243 1 5 89895 286194 376090 00:07:43.243 PASS test contend 00:07:43.243 Starting test hold_by_poller 00:07:43.243 PASS test hold_by_poller 00:07:43.243 Starting test hold_by_message 00:07:43.243 PASS test hold_by_message 00:07:43.243 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:43.243 100014 assertions passed 00:07:43.243 0 assertions failed 00:07:43.243 00:07:43.243 real 0m0.739s 00:07:43.243 user 0m1.045s 00:07:43.243 sys 0m0.081s 00:07:43.243 12:38:13 thread.thread_spdk_lock -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:43.243 12:38:13 thread.thread_spdk_lock -- common/autotest_common.sh@10 -- # set +x 00:07:43.243 ************************************ 00:07:43.243 END TEST thread_spdk_lock 00:07:43.243 ************************************ 00:07:43.243 00:07:43.243 real 0m3.684s 00:07:43.243 user 0m3.380s 00:07:43.243 sys 0m0.517s 00:07:43.243 12:38:13 thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:43.243 12:38:13 thread -- common/autotest_common.sh@10 -- # set +x 00:07:43.244 ************************************ 00:07:43.244 END TEST thread 00:07:43.244 ************************************ 00:07:43.244 12:38:13 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:07:43.244 12:38:13 -- spdk/autotest.sh@176 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:43.244 12:38:13 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:43.244 12:38:13 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:43.244 12:38:13 -- common/autotest_common.sh@10 -- # set +x 00:07:43.244 ************************************ 00:07:43.244 START TEST 
app_cmdline 00:07:43.244 ************************************ 00:07:43.244 12:38:13 app_cmdline -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:43.504 * Looking for test storage... 00:07:43.504 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1693 -- # lcov --version 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@345 -- # : 1 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:43.504 12:38:13 app_cmdline -- scripts/common.sh@368 -- # return 0 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:43.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.504 --rc genhtml_branch_coverage=1 00:07:43.504 --rc genhtml_function_coverage=1 00:07:43.504 --rc genhtml_legend=1 00:07:43.504 --rc geninfo_all_blocks=1 00:07:43.504 --rc geninfo_unexecuted_blocks=1 00:07:43.504 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.504 ' 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:43.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.504 --rc genhtml_branch_coverage=1 00:07:43.504 --rc genhtml_function_coverage=1 00:07:43.504 --rc genhtml_legend=1 00:07:43.504 --rc geninfo_all_blocks=1 00:07:43.504 --rc geninfo_unexecuted_blocks=1 00:07:43.504 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.504 ' 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:43.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.504 --rc genhtml_branch_coverage=1 00:07:43.504 --rc genhtml_function_coverage=1 00:07:43.504 --rc genhtml_legend=1 00:07:43.504 --rc geninfo_all_blocks=1 00:07:43.504 --rc geninfo_unexecuted_blocks=1 00:07:43.504 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.504 ' 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:43.504 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:43.504 --rc genhtml_branch_coverage=1 00:07:43.504 --rc genhtml_function_coverage=1 00:07:43.504 --rc genhtml_legend=1 00:07:43.504 --rc geninfo_all_blocks=1 00:07:43.504 --rc geninfo_unexecuted_blocks=1 00:07:43.504 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:43.504 ' 00:07:43.504 12:38:13 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:43.504 12:38:13 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=605474 00:07:43.504 12:38:13 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 605474 00:07:43.504 12:38:13 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed 
spdk_get_version,rpc_get_methods 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 605474 ']' 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:43.504 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:43.504 12:38:13 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:43.504 [2024-11-28 12:38:13.529004] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:43.504 [2024-11-28 12:38:13.529073] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid605474 ] 00:07:43.764 [2024-11-28 12:38:13.664582] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:43.764 [2024-11-28 12:38:13.699201] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.764 [2024-11-28 12:38:13.722795] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.332 12:38:14 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:44.332 12:38:14 app_cmdline -- common/autotest_common.sh@868 -- # return 0 00:07:44.332 12:38:14 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:44.592 { 00:07:44.592 "version": "SPDK v25.01-pre git sha1 35cd3e84d", 00:07:44.592 "fields": { 00:07:44.592 "major": 25, 00:07:44.592 "minor": 1, 00:07:44.592 "patch": 0, 00:07:44.592 "suffix": "-pre", 00:07:44.592 "commit": "35cd3e84d" 00:07:44.592 } 00:07:44.592 } 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:44.592 12:38:14 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@652 -- # local es=0 00:07:44.592 12:38:14 app_cmdline -- 
common/autotest_common.sh@654 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@644 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@646 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@646 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@646 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:44.592 12:38:14 app_cmdline -- common/autotest_common.sh@655 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:44.851 request: 00:07:44.851 { 00:07:44.851 "method": "env_dpdk_get_mem_stats", 00:07:44.851 "req_id": 1 00:07:44.851 } 00:07:44.851 Got JSON-RPC error response 00:07:44.851 response: 00:07:44.851 { 00:07:44.851 "code": -32601, 00:07:44.851 "message": "Method not found" 00:07:44.851 } 00:07:44.851 12:38:14 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:07:44.851 12:38:14 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:07:44.851 12:38:14 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:07:44.852 12:38:14 app_cmdline -- app/cmdline.sh@1 -- # killprocess 605474 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 605474 ']' 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 605474 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 605474 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 605474' 00:07:44.852 killing process with pid 605474 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@973 -- # kill 605474 00:07:44.852 12:38:14 app_cmdline -- common/autotest_common.sh@978 -- # wait 605474 00:07:45.111 00:07:45.112 real 0m1.841s 00:07:45.112 user 0m2.065s 00:07:45.112 sys 0m0.516s 00:07:45.112 12:38:15 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.112 12:38:15 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:45.112 ************************************ 00:07:45.112 END TEST app_cmdline 00:07:45.112 ************************************ 00:07:45.112 12:38:15 -- spdk/autotest.sh@177 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 
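What the app_cmdline test above exercises is the spdk_tgt RPC whitelist: the target was started with --rpcs-allowed spdk_get_version,rpc_get_methods, so those two methods answer normally while any other call, such as env_dpdk_get_mem_stats, is rejected with JSON-RPC error -32601 (Method not found), and the NOT wrapper treats that failure as the expected outcome. A minimal sketch of the same interaction, using the paths seen in this workspace:

./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
# (the test waits for /var/tmp/spdk.sock before issuing RPCs)
./scripts/rpc.py spdk_get_version         # whitelisted: returns the version JSON shown above
./scripts/rpc.py env_dpdk_get_mem_stats   # not whitelisted: fails with -32601 Method not found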
00:07:45.112 12:38:15 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:45.112 12:38:15 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.112 12:38:15 -- common/autotest_common.sh@10 -- # set +x 00:07:45.112 ************************************ 00:07:45.112 START TEST version 00:07:45.112 ************************************ 00:07:45.112 12:38:15 version -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:45.372 * Looking for test storage... 00:07:45.372 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1693 -- # lcov --version 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:45.372 12:38:15 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:45.372 12:38:15 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:45.372 12:38:15 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:45.372 12:38:15 version -- scripts/common.sh@336 -- # IFS=.-: 00:07:45.372 12:38:15 version -- scripts/common.sh@336 -- # read -ra ver1 00:07:45.372 12:38:15 version -- scripts/common.sh@337 -- # IFS=.-: 00:07:45.372 12:38:15 version -- scripts/common.sh@337 -- # read -ra ver2 00:07:45.372 12:38:15 version -- scripts/common.sh@338 -- # local 'op=<' 00:07:45.372 12:38:15 version -- scripts/common.sh@340 -- # ver1_l=2 00:07:45.372 12:38:15 version -- scripts/common.sh@341 -- # ver2_l=1 00:07:45.372 12:38:15 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:45.372 12:38:15 version -- scripts/common.sh@344 -- # case "$op" in 00:07:45.372 12:38:15 version -- scripts/common.sh@345 -- # : 1 00:07:45.372 12:38:15 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:45.372 12:38:15 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:45.372 12:38:15 version -- scripts/common.sh@365 -- # decimal 1 00:07:45.372 12:38:15 version -- scripts/common.sh@353 -- # local d=1 00:07:45.372 12:38:15 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:45.372 12:38:15 version -- scripts/common.sh@355 -- # echo 1 00:07:45.372 12:38:15 version -- scripts/common.sh@365 -- # ver1[v]=1 00:07:45.372 12:38:15 version -- scripts/common.sh@366 -- # decimal 2 00:07:45.372 12:38:15 version -- scripts/common.sh@353 -- # local d=2 00:07:45.372 12:38:15 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:45.372 12:38:15 version -- scripts/common.sh@355 -- # echo 2 00:07:45.372 12:38:15 version -- scripts/common.sh@366 -- # ver2[v]=2 00:07:45.372 12:38:15 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:45.372 12:38:15 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:45.372 12:38:15 version -- scripts/common.sh@368 -- # return 0 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:45.372 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.372 --rc genhtml_branch_coverage=1 00:07:45.372 --rc genhtml_function_coverage=1 00:07:45.372 --rc genhtml_legend=1 00:07:45.372 --rc geninfo_all_blocks=1 00:07:45.372 --rc geninfo_unexecuted_blocks=1 00:07:45.372 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.372 ' 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:45.372 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.372 --rc genhtml_branch_coverage=1 00:07:45.372 --rc genhtml_function_coverage=1 00:07:45.372 --rc genhtml_legend=1 00:07:45.372 --rc geninfo_all_blocks=1 00:07:45.372 --rc geninfo_unexecuted_blocks=1 00:07:45.372 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.372 ' 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:45.372 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.372 --rc genhtml_branch_coverage=1 00:07:45.372 --rc genhtml_function_coverage=1 00:07:45.372 --rc genhtml_legend=1 00:07:45.372 --rc geninfo_all_blocks=1 00:07:45.372 --rc geninfo_unexecuted_blocks=1 00:07:45.372 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.372 ' 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:45.372 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.372 --rc genhtml_branch_coverage=1 00:07:45.372 --rc genhtml_function_coverage=1 00:07:45.372 --rc genhtml_legend=1 00:07:45.372 --rc geninfo_all_blocks=1 00:07:45.372 --rc geninfo_unexecuted_blocks=1 00:07:45.372 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.372 ' 00:07:45.372 12:38:15 version -- app/version.sh@17 -- # get_header_version major 00:07:45.372 12:38:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:45.372 12:38:15 version -- app/version.sh@14 -- # cut -f2 00:07:45.372 12:38:15 version -- app/version.sh@14 -- # tr -d '"' 00:07:45.372 12:38:15 version -- app/version.sh@17 -- # major=25 00:07:45.372 12:38:15 version -- 
app/version.sh@18 -- # get_header_version minor 00:07:45.372 12:38:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:45.372 12:38:15 version -- app/version.sh@14 -- # cut -f2 00:07:45.372 12:38:15 version -- app/version.sh@14 -- # tr -d '"' 00:07:45.372 12:38:15 version -- app/version.sh@18 -- # minor=1 00:07:45.372 12:38:15 version -- app/version.sh@19 -- # get_header_version patch 00:07:45.372 12:38:15 version -- app/version.sh@14 -- # cut -f2 00:07:45.372 12:38:15 version -- app/version.sh@14 -- # tr -d '"' 00:07:45.372 12:38:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:45.372 12:38:15 version -- app/version.sh@19 -- # patch=0 00:07:45.372 12:38:15 version -- app/version.sh@20 -- # get_header_version suffix 00:07:45.372 12:38:15 version -- app/version.sh@14 -- # tr -d '"' 00:07:45.372 12:38:15 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:45.372 12:38:15 version -- app/version.sh@14 -- # cut -f2 00:07:45.372 12:38:15 version -- app/version.sh@20 -- # suffix=-pre 00:07:45.372 12:38:15 version -- app/version.sh@22 -- # version=25.1 00:07:45.372 12:38:15 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:45.372 12:38:15 version -- app/version.sh@28 -- # version=25.1rc0 00:07:45.372 12:38:15 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:45.372 12:38:15 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:45.372 12:38:15 version -- app/version.sh@30 -- # py_version=25.1rc0 00:07:45.372 12:38:15 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:07:45.372 00:07:45.372 real 0m0.256s 00:07:45.372 user 0m0.138s 00:07:45.372 sys 0m0.165s 00:07:45.372 12:38:15 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:45.372 12:38:15 version -- common/autotest_common.sh@10 -- # set +x 00:07:45.372 ************************************ 00:07:45.372 END TEST version 00:07:45.372 ************************************ 00:07:45.632 12:38:15 -- spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:07:45.632 12:38:15 -- spdk/autotest.sh@194 -- # uname -s 00:07:45.632 12:38:15 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:45.632 12:38:15 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:45.632 12:38:15 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:45.632 12:38:15 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@260 -- # timing_exit lib 00:07:45.632 12:38:15 -- common/autotest_common.sh@732 -- # xtrace_disable 00:07:45.632 12:38:15 -- common/autotest_common.sh@10 -- # set +x 00:07:45.632 12:38:15 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- 
spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']' 00:07:45.632 12:38:15 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]] 00:07:45.632 12:38:15 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]] 00:07:45.632 12:38:15 -- spdk/autotest.sh@374 -- # [[ 1 -eq 1 ]] 00:07:45.632 12:38:15 -- spdk/autotest.sh@375 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:45.632 12:38:15 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:45.632 12:38:15 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.632 12:38:15 -- common/autotest_common.sh@10 -- # set +x 00:07:45.632 ************************************ 00:07:45.632 START TEST llvm_fuzz 00:07:45.632 ************************************ 00:07:45.632 12:38:15 llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:45.632 * Looking for test storage... 00:07:45.632 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:45.632 12:38:15 llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:45.632 12:38:15 llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:45.632 12:38:15 llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:45.892 12:38:15 llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:45.892 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.892 --rc genhtml_branch_coverage=1 00:07:45.892 --rc genhtml_function_coverage=1 00:07:45.892 --rc genhtml_legend=1 00:07:45.892 --rc geninfo_all_blocks=1 00:07:45.892 --rc geninfo_unexecuted_blocks=1 00:07:45.892 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.892 ' 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:45.892 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.892 --rc genhtml_branch_coverage=1 00:07:45.892 --rc genhtml_function_coverage=1 00:07:45.892 --rc genhtml_legend=1 00:07:45.892 --rc geninfo_all_blocks=1 00:07:45.892 --rc geninfo_unexecuted_blocks=1 00:07:45.892 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.892 ' 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:45.892 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.892 --rc genhtml_branch_coverage=1 00:07:45.892 --rc genhtml_function_coverage=1 00:07:45.892 --rc genhtml_legend=1 00:07:45.892 --rc geninfo_all_blocks=1 00:07:45.892 --rc geninfo_unexecuted_blocks=1 00:07:45.892 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.892 ' 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:45.892 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.892 --rc genhtml_branch_coverage=1 00:07:45.892 --rc genhtml_function_coverage=1 00:07:45.892 --rc genhtml_legend=1 00:07:45.892 --rc geninfo_all_blocks=1 00:07:45.892 --rc geninfo_unexecuted_blocks=1 00:07:45.892 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.892 ' 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@550 -- # fuzzers=() 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@550 -- # local fuzzers 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@552 -- # [[ -n '' ]] 00:07:45.892 12:38:15 llvm_fuzz -- 
common/autotest_common.sh@555 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@556 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@559 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:45.892 12:38:15 llvm_fuzz -- fuzz/llvm.sh@19 -- # run_test nvmf_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:45.892 12:38:15 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:07:45.892 ************************************ 00:07:45.892 START TEST nvmf_llvm_fuzz 00:07:45.892 ************************************ 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:45.892 * Looking for test storage... 
00:07:45.892 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:45.892 12:38:15 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:45.892 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:45.892 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.892 --rc genhtml_branch_coverage=1 00:07:45.892 --rc genhtml_function_coverage=1 00:07:45.893 --rc genhtml_legend=1 00:07:45.893 --rc geninfo_all_blocks=1 00:07:45.893 --rc geninfo_unexecuted_blocks=1 00:07:45.893 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.893 ' 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:45.893 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.893 --rc genhtml_branch_coverage=1 00:07:45.893 --rc genhtml_function_coverage=1 00:07:45.893 --rc genhtml_legend=1 00:07:45.893 --rc geninfo_all_blocks=1 00:07:45.893 --rc geninfo_unexecuted_blocks=1 00:07:45.893 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.893 ' 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:45.893 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.893 --rc genhtml_branch_coverage=1 00:07:45.893 --rc genhtml_function_coverage=1 00:07:45.893 --rc genhtml_legend=1 00:07:45.893 --rc geninfo_all_blocks=1 00:07:45.893 --rc geninfo_unexecuted_blocks=1 00:07:45.893 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.893 ' 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:45.893 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:45.893 --rc genhtml_branch_coverage=1 00:07:45.893 --rc genhtml_function_coverage=1 00:07:45.893 --rc genhtml_legend=1 00:07:45.893 --rc geninfo_all_blocks=1 00:07:45.893 --rc geninfo_unexecuted_blocks=1 00:07:45.893 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:45.893 ' 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:45.893 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:07:46.156 12:38:16 
llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:46.156 12:38:16 
llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:46.156 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:46.156 #define SPDK_CONFIG_H 00:07:46.156 #define SPDK_CONFIG_AIO_FSDEV 1 00:07:46.156 #define SPDK_CONFIG_APPS 1 00:07:46.156 #define SPDK_CONFIG_ARCH native 00:07:46.156 #undef SPDK_CONFIG_ASAN 00:07:46.156 #undef SPDK_CONFIG_AVAHI 00:07:46.156 #undef SPDK_CONFIG_CET 00:07:46.156 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:07:46.156 #define SPDK_CONFIG_COVERAGE 1 00:07:46.156 #define SPDK_CONFIG_CROSS_PREFIX 00:07:46.156 #undef SPDK_CONFIG_CRYPTO 00:07:46.156 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:46.156 #undef SPDK_CONFIG_CUSTOMOCF 00:07:46.156 #undef SPDK_CONFIG_DAOS 00:07:46.156 #define SPDK_CONFIG_DAOS_DIR 00:07:46.156 #define SPDK_CONFIG_DEBUG 1 00:07:46.156 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:46.156 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:46.156 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:07:46.156 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:46.156 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:46.156 #undef SPDK_CONFIG_DPDK_UADK 00:07:46.156 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:46.156 #define SPDK_CONFIG_EXAMPLES 1 00:07:46.156 #undef SPDK_CONFIG_FC 00:07:46.156 #define SPDK_CONFIG_FC_PATH 00:07:46.156 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:46.156 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:46.156 #define SPDK_CONFIG_FSDEV 1 00:07:46.157 #undef 
SPDK_CONFIG_FUSE 00:07:46.157 #define SPDK_CONFIG_FUZZER 1 00:07:46.157 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:46.157 #undef SPDK_CONFIG_GOLANG 00:07:46.157 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:46.157 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:46.157 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:46.157 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:07:46.157 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:46.157 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:46.157 #undef SPDK_CONFIG_HAVE_LZ4 00:07:46.157 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:07:46.157 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:07:46.157 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:46.157 #define SPDK_CONFIG_IDXD 1 00:07:46.157 #define SPDK_CONFIG_IDXD_KERNEL 1 00:07:46.157 #undef SPDK_CONFIG_IPSEC_MB 00:07:46.157 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:46.157 #define SPDK_CONFIG_ISAL 1 00:07:46.157 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:46.157 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:46.157 #define SPDK_CONFIG_LIBDIR 00:07:46.157 #undef SPDK_CONFIG_LTO 00:07:46.157 #define SPDK_CONFIG_MAX_LCORES 128 00:07:46.157 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:07:46.157 #define SPDK_CONFIG_NVME_CUSE 1 00:07:46.157 #undef SPDK_CONFIG_OCF 00:07:46.157 #define SPDK_CONFIG_OCF_PATH 00:07:46.157 #define SPDK_CONFIG_OPENSSL_PATH 00:07:46.157 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:46.157 #define SPDK_CONFIG_PGO_DIR 00:07:46.157 #undef SPDK_CONFIG_PGO_USE 00:07:46.157 #define SPDK_CONFIG_PREFIX /usr/local 00:07:46.157 #undef SPDK_CONFIG_RAID5F 00:07:46.157 #undef SPDK_CONFIG_RBD 00:07:46.157 #define SPDK_CONFIG_RDMA 1 00:07:46.157 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:46.157 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:46.157 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:46.157 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:46.157 #undef SPDK_CONFIG_SHARED 00:07:46.157 #undef SPDK_CONFIG_SMA 00:07:46.157 #define SPDK_CONFIG_TESTS 1 00:07:46.157 #undef SPDK_CONFIG_TSAN 00:07:46.157 #define SPDK_CONFIG_UBLK 1 00:07:46.157 #define SPDK_CONFIG_UBSAN 1 00:07:46.157 #undef SPDK_CONFIG_UNIT_TESTS 00:07:46.157 #undef SPDK_CONFIG_URING 00:07:46.157 #define SPDK_CONFIG_URING_PATH 00:07:46.157 #undef SPDK_CONFIG_URING_ZNS 00:07:46.157 #undef SPDK_CONFIG_USDT 00:07:46.157 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:46.157 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:46.157 #define SPDK_CONFIG_VFIO_USER 1 00:07:46.157 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:46.157 #define SPDK_CONFIG_VHOST 1 00:07:46.157 #define SPDK_CONFIG_VIRTIO 1 00:07:46.157 #undef SPDK_CONFIG_VTUNE 00:07:46.157 #define SPDK_CONFIG_VTUNE_DIR 00:07:46.157 #define SPDK_CONFIG_WERROR 1 00:07:46.157 #define SPDK_CONFIG_WPDK_DIR 00:07:46.157 #undef SPDK_CONFIG_XNVME 00:07:46.157 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # uname -s 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:07:46.157 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:07:46.158 12:38:16 
llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:07:46.158 
12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@140 -- # : main 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 
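[Editor's sketch] The long run of `: 0` / `: 1` lines interleaved with `export SPDK_TEST_*` above is bash xtrace of a default-assignment idiom in autotest_common.sh: the no-op builtin `:` forces `${VAR=default}` expansion, so each flag keeps any value the caller already set and otherwise falls back to its default, then is exported for child scripts. A minimal sketch of the pattern, using two flag names and defaults taken from the trace above:

# ':' is a no-op, but evaluating ${VAR=default} as its argument assigns the
# default only when VAR is unset -- which is exactly what the xtrace prints
# as ': 0' or ': 1' immediately before each export.
: "${SPDK_TEST_NVMF=0}"     # default off unless the CI job overrides it
export SPDK_TEST_NVMF
: "${SPDK_RUN_UBSAN=1}"     # default on, per the ': 1' line in the trace
export SPDK_RUN_UBSAN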
00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:46.158 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j72 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 605991 ]] 00:07:46.159 
12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 605991 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.vVkBzv 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.vVkBzv/tests/nvmf /tmp/spdk.vVkBzv 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=84729458688 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=94500356096 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=9770897408 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=47245414400 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=47250178048 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=4763648 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=18894340096 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=18900074496 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5734400 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=47249674240 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=47250178048 00:07:46.159 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=503808 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=9450020864 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=9450033152 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:07:46.160 * Looking for test storage... 
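[Editor's sketch] The `df -T`, `grep -v Filesystem`, and `read -r source fs size use avail _ mount` lines above are the trace of set_test_storage scanning the mount table into associative arrays and then checking whether the mount backing the test directory has room for the ~2 GiB the short fuzz run requests. A simplified sketch of that loop, with the real function's unit handling and fallback-directory logic elided (`testdir` here stands for the fuzz test directory seen in the trace):

# Record source device, filesystem type, size, usage, and free space
# for every mount point reported by df.
declare -A mounts fss sizes avails uses
while read -r source fs size use avail _ mount; do
    mounts["$mount"]=$source
    fss["$mount"]=$fs
    sizes["$mount"]=$size
    uses["$mount"]=$use
    avails["$mount"]=$avail
done < <(df -T | grep -v Filesystem)

# Resolve which mount point backs the test directory, then compare space.
requested_size=2214592512   # value from the trace above (~2 GiB + slack)
mount=$(df "$testdir" | awk '$1 !~ /Filesystem/{print $6}')
(( avails[$mount] >= requested_size )) || echo "not enough space on $mount"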
00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=84729458688 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=11985489920 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:46.160 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:07:46.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.160 --rc genhtml_branch_coverage=1 00:07:46.160 --rc genhtml_function_coverage=1 00:07:46.160 --rc genhtml_legend=1 00:07:46.160 --rc geninfo_all_blocks=1 00:07:46.160 --rc geninfo_unexecuted_blocks=1 00:07:46.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.160 ' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:07:46.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.160 --rc genhtml_branch_coverage=1 00:07:46.160 --rc genhtml_function_coverage=1 00:07:46.160 --rc genhtml_legend=1 00:07:46.160 --rc geninfo_all_blocks=1 00:07:46.160 --rc geninfo_unexecuted_blocks=1 00:07:46.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.160 ' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:07:46.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.160 --rc genhtml_branch_coverage=1 00:07:46.160 --rc genhtml_function_coverage=1 00:07:46.160 --rc genhtml_legend=1 00:07:46.160 --rc geninfo_all_blocks=1 00:07:46.160 --rc geninfo_unexecuted_blocks=1 00:07:46.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.160 ' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:07:46.160 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:46.160 --rc genhtml_branch_coverage=1 00:07:46.160 --rc genhtml_function_coverage=1 00:07:46.160 --rc genhtml_legend=1 00:07:46.160 --rc geninfo_all_blocks=1 00:07:46.160 --rc geninfo_unexecuted_blocks=1 00:07:46.160 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:46.160 ' 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:46.160 12:38:16 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:07:46.160 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:46.161 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:46.161 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:46.161 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:46.161 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:46.161 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@69 -- # mem_size=512 00:07:46.161 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:46.161 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:46.161 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=25 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 0 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4400 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:46.421 12:38:16 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:46.421 [2024-11-28 12:38:16.325366] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:46.421 [2024-11-28 12:38:16.325428] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid606070 ] 00:07:46.681 [2024-11-28 12:38:16.568966] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:46.681 [2024-11-28 12:38:16.615935] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:46.681 [2024-11-28 12:38:16.632205] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.681 [2024-11-28 12:38:16.685117] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:46.681 [2024-11-28 12:38:16.701234] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:46.681 INFO: Running with entropic power schedule (0xFF, 100). 00:07:46.681 INFO: Seed: 1254882901 00:07:46.681 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:46.681 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:46.681 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:46.681 INFO: A corpus is not provided, starting from an empty corpus 00:07:46.681 #2 INITED exec/s: 0 rss: 66Mb 00:07:46.681 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:46.681 This may also happen if the target rejected all inputs we tried so far 00:07:46.681 [2024-11-28 12:38:16.756528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:46.681 [2024-11-28 12:38:16.756556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.250 NEW_FUNC[1/716]: 0x45ed08 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:47.250 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:47.250 #8 NEW cov: 12235 ft: 12234 corp: 2/115b lim: 320 exec/s: 0 rss: 73Mb L: 114/114 MS: 1 InsertRepeatedBytes- 00:07:47.250 [2024-11-28 12:38:17.108428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:47.250 [2024-11-28 12:38:17.108487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.250 #24 NEW cov: 12348 ft: 12835 corp: 3/230b lim: 320 exec/s: 0 rss: 73Mb L: 115/115 MS: 1 InsertByte- 00:07:47.250 [2024-11-28 12:38:17.188744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:47.250 [2024-11-28 12:38:17.188779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.250 #30 NEW cov: 12354 ft: 13127 corp: 4/330b lim: 320 
exec/s: 0 rss: 73Mb L: 100/115 MS: 1 EraseBytes- 00:07:47.250 [2024-11-28 12:38:17.239088] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:47.250 [2024-11-28 12:38:17.239122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.250 #34 NEW cov: 12439 ft: 13317 corp: 5/435b lim: 320 exec/s: 0 rss: 73Mb L: 105/115 MS: 4 CrossOver-CMP-CopyPart-CopyPart- DE: "\001\000\000\000\000\000\000\000"- 00:07:47.250 [2024-11-28 12:38:17.289469] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:47.250 [2024-11-28 12:38:17.289500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.250 #35 NEW cov: 12439 ft: 13492 corp: 6/557b lim: 320 exec/s: 0 rss: 73Mb L: 122/122 MS: 1 CrossOver- 00:07:47.250 [2024-11-28 12:38:17.359505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1999999 00:07:47.250 [2024-11-28 12:38:17.359532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.509 #36 NEW cov: 12439 ft: 13564 corp: 7/662b lim: 320 exec/s: 0 rss: 74Mb L: 105/122 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:47.509 [2024-11-28 12:38:17.429985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:4 nsid:99999999 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.509 [2024-11-28 12:38:17.430012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.509 NEW_FUNC[1/1]: 0x1997408 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:47.509 #37 NEW cov: 12474 ft: 14004 corp: 8/750b lim: 320 exec/s: 0 rss: 74Mb L: 88/122 MS: 1 EraseBytes- 00:07:47.509 [2024-11-28 12:38:17.500051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:4 nsid:99999999 cdw10:9999997a cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.509 [2024-11-28 12:38:17.500080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.509 #38 NEW cov: 12474 ft: 14094 corp: 9/838b lim: 320 exec/s: 0 rss: 74Mb L: 88/122 MS: 1 ChangeByte- 00:07:47.509 [2024-11-28 12:38:17.570427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:7e999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:47.509 [2024-11-28 12:38:17.570454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.509 #39 NEW cov: 12474 ft: 14150 corp: 10/953b lim: 320 exec/s: 0 rss: 74Mb L: 115/122 MS: 1 ChangeByte- 00:07:47.768 [2024-11-28 12:38:17.641407] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:47.768 [2024-11-28 12:38:17.641434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.768 [2024-11-28 
12:38:17.641536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:47.768 [2024-11-28 12:38:17.641554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.768 [2024-11-28 12:38:17.641655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:6 nsid:99999999 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.768 [2024-11-28 12:38:17.641670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.768 NEW_FUNC[1/2]: 0x1560a88 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2213 00:07:47.768 NEW_FUNC[2/2]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:47.768 #40 NEW cov: 12528 ft: 14502 corp: 11/1171b lim: 320 exec/s: 0 rss: 74Mb L: 218/218 MS: 1 InsertRepeatedBytes- 00:07:47.768 [2024-11-28 12:38:17.720843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:4 nsid:99999999 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.768 [2024-11-28 12:38:17.720872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.768 #41 NEW cov: 12528 ft: 14547 corp: 12/1267b lim: 320 exec/s: 41 rss: 74Mb L: 96/218 MS: 1 CMP- DE: "\000J\275\371\220\366\350\352"- 00:07:47.768 [2024-11-28 12:38:17.770851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:47.768 [2024-11-28 12:38:17.770880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.768 #43 NEW cov: 12528 ft: 14568 corp: 13/1351b lim: 320 exec/s: 43 rss: 74Mb L: 84/218 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:47.768 [2024-11-28 12:38:17.821736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:4 nsid:ffffffff cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.768 [2024-11-28 12:38:17.821763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.768 [2024-11-28 12:38:17.821861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:5 nsid:99999999 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.768 [2024-11-28 12:38:17.821876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.768 #48 NEW cov: 12528 ft: 15072 corp: 14/1488b lim: 320 exec/s: 48 rss: 74Mb L: 137/218 MS: 5 CrossOver-ShuffleBytes-InsertRepeatedBytes-CMP-CrossOver- DE: "\377\377\377\377\377\377\377\377"- 00:07:47.768 [2024-11-28 12:38:17.872306] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:47.768 [2024-11-28 12:38:17.872332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.768 [2024-11-28 12:38:17.872419] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:5 nsid:55555555 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.768 [2024-11-28 12:38:17.872434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.768 [2024-11-28 12:38:17.872546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (55) qid:0 cid:6 nsid:55555555 cdw10:55555555 cdw11:55555555 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.768 [2024-11-28 12:38:17.872561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.027 #49 NEW cov: 12528 ft: 15110 corp: 15/1726b lim: 320 exec/s: 49 rss: 74Mb L: 238/238 MS: 1 InsertRepeatedBytes- 00:07:48.027 [2024-11-28 12:38:17.921666] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:48.027 [2024-11-28 12:38:17.921693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.027 #50 NEW cov: 12528 ft: 15145 corp: 16/1811b lim: 320 exec/s: 50 rss: 74Mb L: 85/238 MS: 1 InsertByte- 00:07:48.027 [2024-11-28 12:38:17.991788] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1999999 00:07:48.027 [2024-11-28 12:38:17.991815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.027 #51 NEW cov: 12528 ft: 15184 corp: 17/1916b lim: 320 exec/s: 51 rss: 74Mb L: 105/238 MS: 1 CMP- DE: "\001\000\000\005"- 00:07:48.027 [2024-11-28 12:38:18.041944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:4 nsid:99999999 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.027 [2024-11-28 12:38:18.041974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.027 #52 NEW cov: 12528 ft: 15203 corp: 18/2004b lim: 320 exec/s: 52 rss: 74Mb L: 88/238 MS: 1 ChangeByte- 00:07:48.027 [2024-11-28 12:38:18.092239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff0000 00:07:48.027 [2024-11-28 12:38:18.092267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.028 #53 NEW cov: 12528 ft: 15219 corp: 19/2089b lim: 320 exec/s: 53 rss: 74Mb L: 85/238 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:48.286 [2024-11-28 12:38:18.162945] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:48.286 [2024-11-28 12:38:18.162972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.287 [2024-11-28 12:38:18.163071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:5 nsid:99999999 cdw10:12121212 cdw11:99121212 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.287 [2024-11-28 12:38:18.163088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.287 
#54 NEW cov: 12528 ft: 15236 corp: 20/2222b lim: 320 exec/s: 54 rss: 74Mb L: 133/238 MS: 1 InsertRepeatedBytes- 00:07:48.287 [2024-11-28 12:38:18.212856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:48.287 [2024-11-28 12:38:18.212885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.287 #55 NEW cov: 12528 ft: 15248 corp: 21/2336b lim: 320 exec/s: 55 rss: 74Mb L: 114/238 MS: 1 ChangeByte- 00:07:48.287 [2024-11-28 12:38:18.263024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:4 nsid:99999999 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.287 [2024-11-28 12:38:18.263052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.287 #56 NEW cov: 12528 ft: 15267 corp: 22/2424b lim: 320 exec/s: 56 rss: 74Mb L: 88/238 MS: 1 CopyPart- 00:07:48.287 [2024-11-28 12:38:18.313075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0xdf99999999999999 00:07:48.287 [2024-11-28 12:38:18.313103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.287 #57 NEW cov: 12528 ft: 15324 corp: 23/2546b lim: 320 exec/s: 57 rss: 74Mb L: 122/238 MS: 1 ChangeByte- 00:07:48.287 [2024-11-28 12:38:18.363074] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:48.287 [2024-11-28 12:38:18.363102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.287 #58 NEW cov: 12528 ft: 15378 corp: 24/2662b lim: 320 exec/s: 58 rss: 74Mb L: 116/238 MS: 1 InsertByte- 00:07:48.546 [2024-11-28 12:38:18.413649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:4 nsid:ffffffff cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.546 [2024-11-28 12:38:18.413678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.547 [2024-11-28 12:38:18.413775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:5 nsid:99999999 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.547 [2024-11-28 12:38:18.413793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.547 #59 NEW cov: 12528 ft: 15397 corp: 25/2799b lim: 320 exec/s: 59 rss: 74Mb L: 137/238 MS: 1 ChangeByte- 00:07:48.547 [2024-11-28 12:38:18.493339] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:7e999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:48.547 [2024-11-28 12:38:18.493367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.547 #60 NEW cov: 12528 ft: 15402 corp: 26/2914b lim: 320 exec/s: 60 rss: 74Mb L: 115/238 MS: 1 ChangeBit- 00:07:48.547 [2024-11-28 12:38:18.563417] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:9999996e SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x9999999999999999 00:07:48.547 [2024-11-28 12:38:18.563446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.547 #61 NEW cov: 12528 ft: 15414 corp: 27/3030b lim: 320 exec/s: 61 rss: 74Mb L: 116/238 MS: 1 ChangeBinInt- 00:07:48.547 [2024-11-28 12:38:18.633388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:99999999 SGL TRANSPORT DATA BLOCK TRANSPORT 0x9999999999999999 00:07:48.547 [2024-11-28 12:38:18.633415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.547 #62 NEW cov: 12528 ft: 15474 corp: 28/3146b lim: 320 exec/s: 62 rss: 74Mb L: 116/238 MS: 1 ChangeByte- 00:07:48.807 [2024-11-28 12:38:18.683621] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffff0000 00:07:48.807 [2024-11-28 12:38:18.683649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.807 #63 NEW cov: 12528 ft: 15514 corp: 29/3231b lim: 320 exec/s: 63 rss: 74Mb L: 85/238 MS: 1 ChangeBinInt- 00:07:48.807 [2024-11-28 12:38:18.754348] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:48.807 [2024-11-28 12:38:18.754377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.807 [2024-11-28 12:38:18.754490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:5 nsid:ffffffff cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:48.807 [2024-11-28 12:38:18.754517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.807 [2024-11-28 12:38:18.754620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (99) qid:0 cid:6 nsid:99999999 cdw10:99999999 cdw11:99999999 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.807 [2024-11-28 12:38:18.754635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.807 #64 pulse cov: 12528 ft: 15540 corp: 29/3231b lim: 320 exec/s: 32 rss: 75Mb 00:07:48.807 #64 NEW cov: 12528 ft: 15540 corp: 30/3454b lim: 320 exec/s: 32 rss: 75Mb L: 223/238 MS: 1 InsertRepeatedBytes- 00:07:48.807 #64 DONE cov: 12528 ft: 15540 corp: 30/3454b lim: 320 exec/s: 32 rss: 75Mb 00:07:48.807 ###### Recommended dictionary. ###### 00:07:48.807 "\001\000\000\000\000\000\000\000" # Uses: 2 00:07:48.807 "\000J\275\371\220\366\350\352" # Uses: 0 00:07:48.807 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:48.807 "\001\000\000\005" # Uses: 0 00:07:48.807 ###### End of recommended dictionary. 
###### 00:07:48.807 Done 64 runs in 2 second(s) 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 1 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4401 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:48.807 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:48.808 12:38:18 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:49.067 [2024-11-28 12:38:18.944256] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:49.067 [2024-11-28 12:38:18.944342] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid606430 ] 00:07:49.068 [2024-11-28 12:38:19.184928] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
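The trace above shows ../common.sh advancing the run counter and nvmf/run.sh's start_llvm_fuzz preparing run 1: the port is derived from the run index (printf %02d 1 -> 4401), the corpus directory is created, trsvcid is rewritten in fuzz_json.conf via sed, two known intentional leaks are registered as LSan suppressions, and llvm_nvme_fuzz is launched against the freshly built config. A condensed sketch of that flow, reconstructed from the traced commands (SPDK_DIR stands in for the jenkins workspace path, and the function packaging is an approximation of the real script, not a verbatim copy):

start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local port corpus_dir nvmf_cfg suppress_file
    port=44$(printf %02d "$fuzzer_type")           # run 1 -> 4401, run 2 -> 4402, ...
    corpus_dir=$SPDK_DIR/../corpus/llvm_nvmf_$fuzzer_type
    nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
    suppress_file=/var/tmp/suppress_nvmf_fuzz

    mkdir -p "$corpus_dir"
    # retarget the JSON config at this run's TCP port
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    # the target leaks these two allocations on purpose; hide them from LSan
    echo leak:spdk_nvmf_qpair_disconnect > "$suppress_file"
    echo leak:nvmf_ctrlr_create >> "$suppress_file"
    # the trace declares this as a shell-local; assumed to be exported further up
    local LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0

    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$SPDK_DIR/../output/llvm/" \
        -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
        -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
}

The -Z value matches fuzzer_type, and the NEW_FUNC lines that follow show run 1 exercising fuzz_admin_get_log_page_command, which suggests -Z selects the per-run fuzz handler.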
00:07:49.327 [2024-11-28 12:38:19.230565] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:49.327 [2024-11-28 12:38:19.245459] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:49.327 [2024-11-28 12:38:19.298177] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:49.327 [2024-11-28 12:38:19.314292] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:49.327 INFO: Running with entropic power schedule (0xFF, 100). 00:07:49.327 INFO: Seed: 3865883081 00:07:49.327 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:49.327 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:49.327 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:49.327 INFO: A corpus is not provided, starting from an empty corpus 00:07:49.327 #2 INITED exec/s: 0 rss: 66Mb 00:07:49.327 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:49.327 This may also happen if the target rejected all inputs we tried so far 00:07:49.327 [2024-11-28 12:38:19.361865] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.327 [2024-11-28 12:38:19.361992] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.327 [2024-11-28 12:38:19.362214] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.327 [2024-11-28 12:38:19.362244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.327 [2024-11-28 12:38:19.362302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.327 [2024-11-28 12:38:19.362317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.587 NEW_FUNC[1/717]: 0x45f608 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:49.587 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:49.587 #12 NEW cov: 12318 ft: 12317 corp: 2/18b lim: 30 exec/s: 0 rss: 73Mb L: 17/17 MS: 5 CopyPart-ShuffleBytes-ShuffleBytes-CrossOver-InsertRepeatedBytes- 00:07:49.587 [2024-11-28 12:38:19.701916] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.587 [2024-11-28 12:38:19.702047] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.587 [2024-11-28 12:38:19.702269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.587 [2024-11-28 12:38:19.702313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.587 [2024-11-28 12:38:19.702370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.587 [2024-11-28 12:38:19.702384] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.847 #13 NEW cov: 12431 ft: 12998 corp: 3/30b lim: 30 exec/s: 0 rss: 73Mb L: 12/17 MS: 1 CrossOver- 00:07:49.848 [2024-11-28 12:38:19.741820] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.741956] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.742066] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:49.848 [2024-11-28 12:38:19.742279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.742306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.742364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ea3102ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.742378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.742433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.742447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.848 #14 NEW cov: 12437 ft: 13449 corp: 4/48b lim: 30 exec/s: 0 rss: 73Mb L: 18/18 MS: 1 InsertByte- 00:07:49.848 [2024-11-28 12:38:19.801815] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.801954] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.802177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.802203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.802259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.802274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.848 #15 NEW cov: 12522 ft: 13736 corp: 5/60b lim: 30 exec/s: 0 rss: 73Mb L: 12/18 MS: 1 CopyPart- 00:07:49.848 [2024-11-28 12:38:19.861908] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.862051] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000eaea 00:07:49.848 [2024-11-28 12:38:19.862166] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.862376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.862403] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.862459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d3d383d3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.862479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.862536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.862550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.848 #16 NEW cov: 12522 ft: 13826 corp: 6/81b lim: 30 exec/s: 0 rss: 73Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:07:49.848 [2024-11-28 12:38:19.901853] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.901969] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eaea 00:07:49.848 [2024-11-28 12:38:19.902079] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.902296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.902321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.902378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea81ea cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.902392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.902447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.902460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.848 #17 NEW cov: 12522 ft: 13877 corp: 7/101b lim: 30 exec/s: 0 rss: 74Mb L: 20/21 MS: 1 CrossOver- 00:07:49.848 [2024-11-28 12:38:19.961893] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.962027] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:49.848 [2024-11-28 12:38:19.962137] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:49.848 [2024-11-28 12:38:19.962347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.962371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.962425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea0221 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 
12:38:19.962439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.848 [2024-11-28 12:38:19.962500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:49.848 [2024-11-28 12:38:19.962520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.107 #18 NEW cov: 12522 ft: 13959 corp: 8/119b lim: 30 exec/s: 0 rss: 74Mb L: 18/21 MS: 1 InsertByte- 00:07:50.107 [2024-11-28 12:38:20.001885] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000eb0a 00:07:50.107 [2024-11-28 12:38:20.002099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ebeb83eb cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.107 [2024-11-28 12:38:20.002125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.107 #19 NEW cov: 12522 ft: 14384 corp: 9/125b lim: 30 exec/s: 0 rss: 74Mb L: 6/21 MS: 1 InsertRepeatedBytes- 00:07:50.107 [2024-11-28 12:38:20.042023] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.107 [2024-11-28 12:38:20.042154] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (764844) > buf size (4096) 00:07:50.107 [2024-11-28 12:38:20.042377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.107 [2024-11-28 12:38:20.042408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.107 [2024-11-28 12:38:20.042468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.107 [2024-11-28 12:38:20.042490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.107 #30 NEW cov: 12545 ft: 14426 corp: 10/137b lim: 30 exec/s: 0 rss: 74Mb L: 12/21 MS: 1 ChangeByte- 00:07:50.107 [2024-11-28 12:38:20.102087] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.107 [2024-11-28 12:38:20.102214] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000eaea 00:07:50.107 [2024-11-28 12:38:20.102323] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007373 00:07:50.107 [2024-11-28 12:38:20.102432] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300007373 00:07:50.107 [2024-11-28 12:38:20.102546] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:50.107 [2024-11-28 12:38:20.102770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.107 [2024-11-28 12:38:20.102797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.107 [2024-11-28 12:38:20.102855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d3d383d3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:50.107 [2024-11-28 12:38:20.102870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.107 [2024-11-28 12:38:20.102926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea83ea cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.107 [2024-11-28 12:38:20.102940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.107 [2024-11-28 12:38:20.102995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:73738373 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.107 [2024-11-28 12:38:20.103008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.108 [2024-11-28 12:38:20.103061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.108 [2024-11-28 12:38:20.103075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:50.108 #31 NEW cov: 12545 ft: 15024 corp: 11/167b lim: 30 exec/s: 0 rss: 74Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:50.108 [2024-11-28 12:38:20.162007] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.108 [2024-11-28 12:38:20.162128] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (240556) > buf size (4096) 00:07:50.108 [2024-11-28 12:38:20.162341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.108 [2024-11-28 12:38:20.162367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.108 [2024-11-28 12:38:20.162426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.108 [2024-11-28 12:38:20.162440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.108 #32 NEW cov: 12545 ft: 15091 corp: 12/184b lim: 30 exec/s: 0 rss: 74Mb L: 17/30 MS: 1 InsertRepeatedBytes- 00:07:50.108 [2024-11-28 12:38:20.202008] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2000041ea 00:07:50.108 [2024-11-28 12:38:20.202129] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.108 [2024-11-28 12:38:20.202340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.108 [2024-11-28 12:38:20.202366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.108 [2024-11-28 12:38:20.202421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.108 [2024-11-28 12:38:20.202435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.108 #33 NEW cov: 12545 ft: 15176 
corp: 13/196b lim: 30 exec/s: 0 rss: 74Mb L: 12/30 MS: 1 ChangeByte- 00:07:50.367 [2024-11-28 12:38:20.242017] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.367 [2024-11-28 12:38:20.242138] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (764844) > buf size (4096) 00:07:50.367 [2024-11-28 12:38:20.242344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.242370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.367 [2024-11-28 12:38:20.242425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.242440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.367 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:50.367 #34 NEW cov: 12568 ft: 15282 corp: 14/208b lim: 30 exec/s: 0 rss: 74Mb L: 12/30 MS: 1 CopyPart- 00:07:50.367 [2024-11-28 12:38:20.302089] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.367 [2024-11-28 12:38:20.302225] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.367 [2024-11-28 12:38:20.302337] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:50.367 [2024-11-28 12:38:20.302548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ca cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.302574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.367 [2024-11-28 12:38:20.302630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea0221 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.302647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.367 [2024-11-28 12:38:20.302703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.302717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.367 #35 NEW cov: 12568 ft: 15336 corp: 15/226b lim: 30 exec/s: 0 rss: 74Mb L: 18/30 MS: 1 ChangeBit- 00:07:50.367 [2024-11-28 12:38:20.362076] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:50.367 [2024-11-28 12:38:20.362195] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.367 [2024-11-28 12:38:20.362411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.362437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.367 
[2024-11-28 12:38:20.362495] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000020c cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.362509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.367 #36 NEW cov: 12568 ft: 15413 corp: 16/238b lim: 30 exec/s: 36 rss: 74Mb L: 12/30 MS: 1 ChangeBinInt- 00:07:50.367 [2024-11-28 12:38:20.402117] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.367 [2024-11-28 12:38:20.402252] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ea31 00:07:50.367 [2024-11-28 12:38:20.402366] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.367 [2024-11-28 12:38:20.402618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:eaea020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.402644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.367 [2024-11-28 12:38:20.402700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.402714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.367 [2024-11-28 12:38:20.402767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.402782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.367 #37 NEW cov: 12568 ft: 15466 corp: 17/260b lim: 30 exec/s: 37 rss: 74Mb L: 22/30 MS: 1 CopyPart- 00:07:50.367 [2024-11-28 12:38:20.462186] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.367 [2024-11-28 12:38:20.462305] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000d3d3 00:07:50.367 [2024-11-28 12:38:20.462417] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.367 [2024-11-28 12:38:20.462535] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:50.367 [2024-11-28 12:38:20.462754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.462779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.367 [2024-11-28 12:38:20.462849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d3ea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.462867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.367 [2024-11-28 12:38:20.462922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d3ea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 
12:38:20.462936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.367 [2024-11-28 12:38:20.462988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.367 [2024-11-28 12:38:20.463001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.367 #38 NEW cov: 12568 ft: 15484 corp: 18/284b lim: 30 exec/s: 38 rss: 74Mb L: 24/30 MS: 1 CopyPart- 00:07:50.626 [2024-11-28 12:38:20.502272] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.626 [2024-11-28 12:38:20.502512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.502536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.626 [2024-11-28 12:38:20.502593] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0c00020c cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.502607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.626 #39 NEW cov: 12585 ft: 15558 corp: 19/296b lim: 30 exec/s: 39 rss: 74Mb L: 12/30 MS: 1 CopyPart- 00:07:50.626 [2024-11-28 12:38:20.562165] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.626 [2024-11-28 12:38:20.562283] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (764844) > buf size (4096) 00:07:50.626 [2024-11-28 12:38:20.562509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ea0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.562534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.626 [2024-11-28 12:38:20.562591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.562604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.626 #40 NEW cov: 12585 ft: 15634 corp: 20/308b lim: 30 exec/s: 40 rss: 74Mb L: 12/30 MS: 1 ShuffleBytes- 00:07:50.626 [2024-11-28 12:38:20.622181] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.626 [2024-11-28 12:38:20.622300] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (764844) > buf size (4096) 00:07:50.626 [2024-11-28 12:38:20.622509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ea0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.622534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.626 [2024-11-28 12:38:20.622587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:50.626 [2024-11-28 12:38:20.622600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.626 #41 NEW cov: 12585 ft: 15654 corp: 21/320b lim: 30 exec/s: 41 rss: 74Mb L: 12/30 MS: 1 ChangeBit- 00:07:50.626 [2024-11-28 12:38:20.682237] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xeaea 00:07:50.626 [2024-11-28 12:38:20.682379] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (240556) > buf size (4096) 00:07:50.626 [2024-11-28 12:38:20.682499] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.626 [2024-11-28 12:38:20.682715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a00ea cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.682741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.626 [2024-11-28 12:38:20.682798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea00ea cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.682812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.626 [2024-11-28 12:38:20.682866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:000002ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.682880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.626 #42 NEW cov: 12585 ft: 15673 corp: 22/338b lim: 30 exec/s: 42 rss: 74Mb L: 18/30 MS: 1 InsertByte- 00:07:50.626 [2024-11-28 12:38:20.742255] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004aea 00:07:50.626 [2024-11-28 12:38:20.742374] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.626 [2024-11-28 12:38:20.742492] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:50.626 [2024-11-28 12:38:20.742705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.742731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.626 [2024-11-28 12:38:20.742785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.742799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.626 [2024-11-28 12:38:20.742852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.626 [2024-11-28 12:38:20.742866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.885 #43 NEW cov: 12585 ft: 15693 corp: 23/356b lim: 30 exec/s: 43 rss: 74Mb L: 18/30 MS: 1 InsertByte- 00:07:50.885 [2024-11-28 12:38:20.782276] 
ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.885 [2024-11-28 12:38:20.782400] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ea32 00:07:50.885 [2024-11-28 12:38:20.782522] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.885 [2024-11-28 12:38:20.782741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.782767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.885 [2024-11-28 12:38:20.782823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea81ea cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.782838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.885 [2024-11-28 12:38:20.782891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.782912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.885 #44 NEW cov: 12585 ft: 15710 corp: 24/377b lim: 30 exec/s: 44 rss: 74Mb L: 21/30 MS: 1 InsertByte- 00:07:50.885 [2024-11-28 12:38:20.822251] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10652) > buf size (4096) 00:07:50.885 [2024-11-28 12:38:20.822386] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.885 [2024-11-28 12:38:20.822620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a660000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.822646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.885 [2024-11-28 12:38:20.822700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000020c cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.822715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.885 #45 NEW cov: 12585 ft: 15761 corp: 25/389b lim: 30 exec/s: 45 rss: 74Mb L: 12/30 MS: 1 CMP- DE: "f\000\000\000"- 00:07:50.885 [2024-11-28 12:38:20.862340] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.885 [2024-11-28 12:38:20.862461] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000ea32 00:07:50.885 [2024-11-28 12:38:20.862583] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ea66 00:07:50.885 [2024-11-28 12:38:20.862802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.862828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.885 [2024-11-28 12:38:20.862886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) 
qid:0 cid:5 nsid:0 cdw10:eaea81ea cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.862900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.885 [2024-11-28 12:38:20.862957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.862971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.885 #46 NEW cov: 12585 ft: 15793 corp: 26/410b lim: 30 exec/s: 46 rss: 74Mb L: 21/30 MS: 1 PersAutoDict- DE: "f\000\000\000"- 00:07:50.885 [2024-11-28 12:38:20.922414] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xea 00:07:50.885 [2024-11-28 12:38:20.922544] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000d3d3 00:07:50.885 [2024-11-28 12:38:20.922660] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.885 [2024-11-28 12:38:20.922768] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ea0a 00:07:50.885 [2024-11-28 12:38:20.922996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a660000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.885 [2024-11-28 12:38:20.923022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.885 [2024-11-28 12:38:20.923078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:20.923092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.886 [2024-11-28 12:38:20.923146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:d3d302ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:20.923163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.886 [2024-11-28 12:38:20.923219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:20.923232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:50.886 #47 NEW cov: 12585 ft: 15806 corp: 27/435b lim: 30 exec/s: 47 rss: 74Mb L: 25/30 MS: 1 PersAutoDict- DE: "f\000\000\000"- 00:07:50.886 [2024-11-28 12:38:20.962446] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.886 [2024-11-28 12:38:20.962583] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000eaca 00:07:50.886 [2024-11-28 12:38:20.962702] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.886 [2024-11-28 12:38:20.962922] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:20.962949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.886 [2024-11-28 12:38:20.963008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:d3d383d3 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:20.963024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.886 [2024-11-28 12:38:20.963084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:20.963099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.886 #48 NEW cov: 12585 ft: 15820 corp: 28/456b lim: 30 exec/s: 48 rss: 74Mb L: 21/30 MS: 1 ChangeBit- 00:07:50.886 [2024-11-28 12:38:21.002436] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xe0ea 00:07:50.886 [2024-11-28 12:38:21.002581] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.886 [2024-11-28 12:38:21.002695] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eaea 00:07:50.886 [2024-11-28 12:38:21.002807] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:50.886 [2024-11-28 12:38:21.003028] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:eae000e0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:21.003055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.886 [2024-11-28 12:38:21.003111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:21.003126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.886 [2024-11-28 12:38:21.003182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea81ea cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:21.003195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.886 [2024-11-28 12:38:21.003251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.886 [2024-11-28 12:38:21.003265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.145 #49 NEW cov: 12585 ft: 15855 corp: 29/482b lim: 30 exec/s: 49 rss: 75Mb L: 26/30 MS: 1 InsertRepeatedBytes- 00:07:51.145 [2024-11-28 12:38:21.062411] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:51.145 [2024-11-28 12:38:21.062561] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (764844) > buf size (4096) 00:07:51.145 [2024-11-28 12:38:21.062774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ea0a83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.062800] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.145 [2024-11-28 12:38:21.062856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.062871] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.145 #50 NEW cov: 12585 ft: 15862 corp: 30/494b lim: 30 exec/s: 50 rss: 75Mb L: 12/30 MS: 1 CMP- DE: "\377\377\377\377"- 00:07:51.145 [2024-11-28 12:38:21.102520] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xe0ea 00:07:51.145 [2024-11-28 12:38:21.102641] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:51.145 [2024-11-28 12:38:21.102754] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x100006161 00:07:51.145 [2024-11-28 12:38:21.102864] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:51.145 [2024-11-28 12:38:21.102983] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:51.145 [2024-11-28 12:38:21.103200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:eae000e0 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.103225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.145 [2024-11-28 12:38:21.103282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.103296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.145 [2024-11-28 12:38:21.103350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea8161 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.103364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.145 [2024-11-28 12:38:21.103419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ea3102ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.103431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.145 [2024-11-28 12:38:21.103487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.103501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:51.145 #51 NEW cov: 12585 ft: 15899 corp: 31/524b lim: 30 exec/s: 51 rss: 75Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:51.145 [2024-11-28 12:38:21.162388] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (534572) > buf size (4096) 00:07:51.145 [2024-11-28 12:38:21.162616] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a0a02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:51.145 [2024-11-28 12:38:21.162641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.145 #52 NEW cov: 12585 ft: 15911 corp: 32/535b lim: 30 exec/s: 52 rss: 75Mb L: 11/30 MS: 1 EraseBytes- 00:07:51.145 [2024-11-28 12:38:21.202542] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:51.145 [2024-11-28 12:38:21.202679] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:51.145 [2024-11-28 12:38:21.202795] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:51.145 [2024-11-28 12:38:21.203008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ca cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.203034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.145 [2024-11-28 12:38:21.203088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea0221 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.203102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.145 [2024-11-28 12:38:21.203157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.203172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.145 #53 NEW cov: 12585 ft: 15944 corp: 33/553b lim: 30 exec/s: 53 rss: 75Mb L: 18/30 MS: 1 ChangeByte- 00:07:51.145 [2024-11-28 12:38:21.262510] ctrlr.c:2669:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (36908) > buf size (4096) 00:07:51.145 [2024-11-28 12:38:21.262645] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xeaea 00:07:51.145 [2024-11-28 12:38:21.262872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:240a0066 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.262898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.145 [2024-11-28 12:38:21.262952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.145 [2024-11-28 12:38:21.262966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.405 #54 NEW cov: 12585 ft: 15956 corp: 34/566b lim: 30 exec/s: 54 rss: 75Mb L: 13/30 MS: 1 InsertByte- 00:07:51.405 [2024-11-28 12:38:21.322538] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:51.405 [2024-11-28 12:38:21.322685] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000eaea 00:07:51.405 [2024-11-28 12:38:21.322796] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:51.405 [2024-11-28 12:38:21.323021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:07:51.405 [2024-11-28 12:38:21.323047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.405 [2024-11-28 12:38:21.323104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea81ea cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.405 [2024-11-28 12:38:21.323118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.405 [2024-11-28 12:38:21.323174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.405 [2024-11-28 12:38:21.323187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.405 #55 NEW cov: 12585 ft: 15962 corp: 35/586b lim: 30 exec/s: 55 rss: 75Mb L: 20/30 MS: 1 ShuffleBytes- 00:07:51.405 [2024-11-28 12:38:21.362573] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200004aea 00:07:51.405 [2024-11-28 12:38:21.362708] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000eaea 00:07:51.405 [2024-11-28 12:38:21.362817] ctrlr.c:2657:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000a0a 00:07:51.405 [2024-11-28 12:38:21.363040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0aea02ee cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.405 [2024-11-28 12:38:21.363066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.405 [2024-11-28 12:38:21.363122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.405 [2024-11-28 12:38:21.363137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.405 [2024-11-28 12:38:21.363191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:eaea02ea cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.405 [2024-11-28 12:38:21.363205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.405 #56 NEW cov: 12585 ft: 15970 corp: 36/604b lim: 30 exec/s: 28 rss: 75Mb L: 18/30 MS: 1 ChangeBit- 00:07:51.405 #56 DONE cov: 12585 ft: 15970 corp: 36/604b lim: 30 exec/s: 28 rss: 75Mb 00:07:51.405 ###### Recommended dictionary. ###### 00:07:51.405 "f\000\000\000" # Uses: 2 00:07:51.405 "\377\377\377\377" # Uses: 0 00:07:51.405 ###### End of recommended dictionary. 
###### 00:07:51.405 Done 56 runs in 2 second(s) 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 2 00:07:51.405 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4402 00:07:51.406 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:51.406 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:51.406 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:51.406 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:51.406 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:51.406 12:38:21 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:51.665 [2024-11-28 12:38:21.536598] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:51.665 [2024-11-28 12:38:21.536659] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid606783 ] 00:07:51.665 [2024-11-28 12:38:21.772411] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:51.924 [2024-11-28 12:38:21.818840] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:51.924 [2024-11-28 12:38:21.833982] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:51.924 [2024-11-28 12:38:21.886597] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:51.924 [2024-11-28 12:38:21.902712] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:51.924 INFO: Running with entropic power schedule (0xFF, 100). 00:07:51.924 INFO: Seed: 2161914309 00:07:51.924 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:51.924 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:51.924 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:51.924 INFO: A corpus is not provided, starting from an empty corpus 00:07:51.924 #2 INITED exec/s: 0 rss: 66Mb 00:07:51.924 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:51.924 This may also happen if the target rejected all inputs we tried so far 00:07:51.924 [2024-11-28 12:38:21.958515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.924 [2024-11-28 12:38:21.958543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.924 [2024-11-28 12:38:21.958618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.924 [2024-11-28 12:38:21.958632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.924 [2024-11-28 12:38:21.958687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.924 [2024-11-28 12:38:21.958700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.924 [2024-11-28 12:38:21.958757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.924 [2024-11-28 12:38:21.958771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.183 NEW_FUNC[1/716]: 0x4620b8 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:52.183 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:52.184 #3 NEW cov: 12274 ft: 12272 corp: 2/29b lim: 35 exec/s: 0 rss: 73Mb L: 28/28 MS: 1 InsertRepeatedBytes- 00:07:52.184 [2024-11-28 12:38:22.278302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.184 [2024-11-28 12:38:22.278339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.184 [2024-11-28 12:38:22.278400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) 
qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.184 [2024-11-28 12:38:22.278416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.443 #4 NEW cov: 12387 ft: 13481 corp: 3/43b lim: 35 exec/s: 0 rss: 73Mb L: 14/28 MS: 1 EraseBytes- 00:07:52.443 [2024-11-28 12:38:22.338231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.338258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.443 [2024-11-28 12:38:22.338335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.338349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.443 #5 NEW cov: 12393 ft: 13694 corp: 4/62b lim: 35 exec/s: 0 rss: 73Mb L: 19/28 MS: 1 InsertRepeatedBytes- 00:07:52.443 [2024-11-28 12:38:22.398261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.398287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.443 [2024-11-28 12:38:22.398366] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.398381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.443 #6 NEW cov: 12478 ft: 13904 corp: 5/81b lim: 35 exec/s: 0 rss: 73Mb L: 19/28 MS: 1 ShuffleBytes- 00:07:52.443 [2024-11-28 12:38:22.458232] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:52.443 [2024-11-28 12:38:22.458579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.458605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.443 [2024-11-28 12:38:22.458664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.458678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.443 [2024-11-28 12:38:22.458736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff0000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.458752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.443 [2024-11-28 12:38:22.458810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.458824] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.443 #7 NEW cov: 12489 ft: 14055 corp: 6/113b lim: 35 exec/s: 0 rss: 73Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:52.443 [2024-11-28 12:38:22.497943] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:52.443 [2024-11-28 12:38:22.498177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.498206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.443 #8 NEW cov: 12489 ft: 14401 corp: 7/122b lim: 35 exec/s: 0 rss: 73Mb L: 9/32 MS: 1 InsertRepeatedBytes- 00:07:52.443 [2024-11-28 12:38:22.538296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:fb00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.443 [2024-11-28 12:38:22.538321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.443 [2024-11-28 12:38:22.538379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.444 [2024-11-28 12:38:22.538393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.444 #9 NEW cov: 12489 ft: 14570 corp: 8/136b lim: 35 exec/s: 0 rss: 73Mb L: 14/32 MS: 1 ChangeBit- 00:07:52.703 [2024-11-28 12:38:22.578284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.578310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.703 [2024-11-28 12:38:22.578369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00acff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.578383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.703 #10 NEW cov: 12489 ft: 14631 corp: 9/156b lim: 35 exec/s: 0 rss: 73Mb L: 20/32 MS: 1 InsertByte- 00:07:52.703 #11 NEW cov: 12489 ft: 15150 corp: 10/175b lim: 35 exec/s: 0 rss: 74Mb L: 19/32 MS: 1 ChangeBinInt- 00:07:52.703 [2024-11-28 12:38:22.678353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.678379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.703 [2024-11-28 12:38:22.678456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:94ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.678477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.703 #12 NEW cov: 12489 ft: 15211 corp: 11/190b lim: 35 exec/s: 0 rss: 74Mb L: 15/32 MS: 1 InsertByte- 00:07:52.703 [2024-11-28 12:38:22.718602] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.718627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.703 [2024-11-28 12:38:22.718701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.718717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.703 [2024-11-28 12:38:22.718775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.718788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.703 [2024-11-28 12:38:22.718848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff6700ff cdw11:67006767 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.718862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.703 #13 NEW cov: 12489 ft: 15223 corp: 12/222b lim: 35 exec/s: 0 rss: 74Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:07:52.703 [2024-11-28 12:38:22.758412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.758438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.703 [2024-11-28 12:38:22.758498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:0000ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.758512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.703 #14 NEW cov: 12489 ft: 15287 corp: 13/241b lim: 35 exec/s: 0 rss: 74Mb L: 19/32 MS: 1 CopyPart- 00:07:52.703 [2024-11-28 12:38:22.798393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.798421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.703 [2024-11-28 12:38:22.798482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.703 [2024-11-28 12:38:22.798496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.703 #20 NEW cov: 12489 ft: 15349 corp: 14/258b lim: 35 exec/s: 0 rss: 74Mb L: 17/32 MS: 1 EraseBytes- 00:07:52.963 [2024-11-28 12:38:22.838546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.838572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.838646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:94ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.838661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.838717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:0aff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.838732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.963 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:52.963 #21 NEW cov: 12512 ft: 15557 corp: 15/283b lim: 35 exec/s: 0 rss: 74Mb L: 25/32 MS: 1 CrossOver- 00:07:52.963 [2024-11-28 12:38:22.898727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.898752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.898812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.898826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.898885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.898898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.898955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ff0000ff cdw11:ff001cff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.898968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.963 #22 NEW cov: 12512 ft: 15586 corp: 16/311b lim: 35 exec/s: 0 rss: 74Mb L: 28/32 MS: 1 ChangeBinInt- 00:07:52.963 [2024-11-28 12:38:22.938452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.938481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.938557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.938571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.963 #23 NEW cov: 12512 ft: 15629 corp: 17/330b lim: 35 exec/s: 23 rss: 74Mb L: 19/32 MS: 1 CopyPart- 00:07:52.963 [2024-11-28 12:38:22.978449] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify 
Namespace for invalid NSID 0 00:07:52.963 [2024-11-28 12:38:22.978819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.978845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.978905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.978919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.978977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:ff00ff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.978993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:22.979050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:22.979064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.963 #24 NEW cov: 12512 ft: 15640 corp: 18/363b lim: 35 exec/s: 24 rss: 74Mb L: 33/33 MS: 1 CopyPart- 00:07:52.963 [2024-11-28 12:38:23.038705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:23.038730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:23.038790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:94ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:23.038804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.963 [2024-11-28 12:38:23.038861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff000aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.963 [2024-11-28 12:38:23.038874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.963 #25 NEW cov: 12512 ft: 15677 corp: 19/388b lim: 35 exec/s: 25 rss: 74Mb L: 25/33 MS: 1 ShuffleBytes- 00:07:53.223 [2024-11-28 12:38:23.098558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.098584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.223 [2024-11-28 12:38:23.098659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.098674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:07:53.223 #29 NEW cov: 12512 ft: 15686 corp: 20/404b lim: 35 exec/s: 29 rss: 74Mb L: 16/33 MS: 4 CopyPart-InsertByte-ChangeBit-InsertRepeatedBytes- 00:07:53.223 [2024-11-28 12:38:23.138578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.138603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.223 [2024-11-28 12:38:23.138662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.138678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.223 #30 NEW cov: 12512 ft: 15727 corp: 21/423b lim: 35 exec/s: 30 rss: 74Mb L: 19/33 MS: 1 InsertRepeatedBytes- 00:07:53.223 [2024-11-28 12:38:23.178711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a0a000a cdw11:be000add SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.178737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.223 [2024-11-28 12:38:23.178813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:bebe00be cdw11:be00bebe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.178828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.223 [2024-11-28 12:38:23.178887] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:bebe00be cdw11:be00bebe SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.178901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.223 #35 NEW cov: 12512 ft: 15761 corp: 22/445b lim: 35 exec/s: 35 rss: 74Mb L: 22/33 MS: 5 ShuffleBytes-CopyPart-InsertByte-CopyPart-InsertRepeatedBytes- 00:07:53.223 [2024-11-28 12:38:23.218607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.218631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.223 [2024-11-28 12:38:23.218706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.218721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.223 #36 NEW cov: 12512 ft: 15770 corp: 23/463b lim: 35 exec/s: 36 rss: 74Mb L: 18/33 MS: 1 CopyPart- 00:07:53.223 [2024-11-28 12:38:23.258681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.258706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.223 [2024-11-28 12:38:23.258767] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.258782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.223 #37 NEW cov: 12512 ft: 15777 corp: 24/479b lim: 35 exec/s: 37 rss: 74Mb L: 16/33 MS: 1 EraseBytes- 00:07:53.223 [2024-11-28 12:38:23.318665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.318691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.223 [2024-11-28 12:38:23.318766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:d1ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.223 [2024-11-28 12:38:23.318781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.223 #38 NEW cov: 12512 ft: 15807 corp: 25/493b lim: 35 exec/s: 38 rss: 75Mb L: 14/33 MS: 1 ChangeByte- 00:07:53.483 [2024-11-28 12:38:23.358659] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:53.483 [2024-11-28 12:38:23.359019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.483 [2024-11-28 12:38:23.359048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.483 [2024-11-28 12:38:23.359109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.483 [2024-11-28 12:38:23.359124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.483 [2024-11-28 12:38:23.359182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.483 [2024-11-28 12:38:23.359198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.483 [2024-11-28 12:38:23.359257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.359271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.484 #39 NEW cov: 12512 ft: 15815 corp: 26/521b lim: 35 exec/s: 39 rss: 75Mb L: 28/33 MS: 1 CMP- DE: "\001\000"- 00:07:53.484 [2024-11-28 12:38:23.398667] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:53.484 [2024-11-28 12:38:23.398933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.398959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.399020] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00acff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.399035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.399093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00009b00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.399108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.484 #40 NEW cov: 12512 ft: 15824 corp: 27/542b lim: 35 exec/s: 40 rss: 75Mb L: 21/33 MS: 1 InsertByte- 00:07:53.484 [2024-11-28 12:38:23.459074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.459100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.459174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.459188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.459246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.459259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.459317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.459331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.484 #41 NEW cov: 12512 ft: 15863 corp: 28/573b lim: 35 exec/s: 41 rss: 75Mb L: 31/33 MS: 1 CopyPart- 00:07:53.484 [2024-11-28 12:38:23.498743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.498769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.498843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ff83 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.498858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.484 #42 NEW cov: 12512 ft: 15892 corp: 29/587b lim: 35 exec/s: 42 rss: 75Mb L: 14/33 MS: 1 ChangeByte- 00:07:53.484 [2024-11-28 12:38:23.538785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.538810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.538885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ff0000ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.538900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.484 #43 NEW cov: 12512 ft: 15897 corp: 30/606b lim: 35 exec/s: 43 rss: 75Mb L: 19/33 MS: 1 ShuffleBytes- 00:07:53.484 [2024-11-28 12:38:23.578941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.578966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.579027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.579041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.484 [2024-11-28 12:38:23.579100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d1ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.484 [2024-11-28 12:38:23.579114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.744 #44 NEW cov: 12512 ft: 15969 corp: 31/627b lim: 35 exec/s: 44 rss: 75Mb L: 21/33 MS: 1 CopyPart- 00:07:53.744 [2024-11-28 12:38:23.638728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.638755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.744 #45 NEW cov: 12512 ft: 16031 corp: 32/637b lim: 35 exec/s: 45 rss: 75Mb L: 10/33 MS: 1 EraseBytes- 00:07:53.744 [2024-11-28 12:38:23.678995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.679021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.744 [2024-11-28 12:38:23.679081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:94ff002a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.679095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.744 [2024-11-28 12:38:23.679152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:ff000aff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.679166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.744 #46 NEW cov: 12512 ft: 16080 corp: 33/662b lim: 35 exec/s: 46 rss: 75Mb L: 25/33 MS: 1 ChangeByte- 00:07:53.744 [2024-11-28 12:38:23.738932] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY 
(06) qid:0 cid:4 nsid:0 cdw10:ffff00ef cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.738957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.744 [2024-11-28 12:38:23.739029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.739044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.744 #47 NEW cov: 12512 ft: 16100 corp: 34/681b lim: 35 exec/s: 47 rss: 75Mb L: 19/33 MS: 1 ChangeBit- 00:07:53.744 [2024-11-28 12:38:23.778762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.778787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.744 #48 NEW cov: 12512 ft: 16135 corp: 35/694b lim: 35 exec/s: 48 rss: 75Mb L: 13/33 MS: 1 EraseBytes- 00:07:53.744 [2024-11-28 12:38:23.839081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.839106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.744 [2024-11-28 12:38:23.839164] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:7a00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.839177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.744 [2024-11-28 12:38:23.839232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:d1ff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.744 [2024-11-28 12:38:23.839246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.004 #49 NEW cov: 12512 ft: 16170 corp: 36/715b lim: 35 exec/s: 49 rss: 75Mb L: 21/33 MS: 1 ChangeByte- 00:07:54.004 [2024-11-28 12:38:23.898900] ctrlr.c:2752:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:54.004 [2024-11-28 12:38:23.899146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff00ff cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.004 [2024-11-28 12:38:23.899170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.004 [2024-11-28 12:38:23.899229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:ffff00ff cdw11:ff00acff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.004 [2024-11-28 12:38:23.899243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.004 [2024-11-28 12:38:23.899301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00009b32 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.004 [2024-11-28 12:38:23.899317] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.004 #50 NEW cov: 12512 ft: 16189 corp: 37/737b lim: 35 exec/s: 25 rss: 75Mb L: 22/33 MS: 1 InsertByte- 00:07:54.004 #50 DONE cov: 12512 ft: 16189 corp: 37/737b lim: 35 exec/s: 25 rss: 75Mb 00:07:54.004 ###### Recommended dictionary. ###### 00:07:54.004 "\001\000" # Uses: 0 00:07:54.004 ###### End of recommended dictionary. ###### 00:07:54.004 Done 50 runs in 2 second(s) 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 3 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4403 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:54.004 12:38:24 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:54.004 [2024-11-28 12:38:24.072778] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:07:54.004 [2024-11-28 12:38:24.072829] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid607142 ] 00:07:54.264 [2024-11-28 12:38:24.313442] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:07:54.264 [2024-11-28 12:38:24.358822] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:54.264 [2024-11-28 12:38:24.373790] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:54.522 [2024-11-28 12:38:24.426526] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:54.522 [2024-11-28 12:38:24.442643] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:54.522 INFO: Running with entropic power schedule (0xFF, 100). 00:07:54.522 INFO: Seed: 404955314 00:07:54.523 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:54.523 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:54.523 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:54.523 INFO: A corpus is not provided, starting from an empty corpus 00:07:54.523 #2 INITED exec/s: 0 rss: 66Mb 00:07:54.523 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:54.523 This may also happen if the target rejected all inputs we tried so far 00:07:54.782 NEW_FUNC[1/705]: 0x463d98 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:54.782 NEW_FUNC[2/705]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:54.782 #3 NEW cov: 12169 ft: 12166 corp: 2/7b lim: 20 exec/s: 0 rss: 73Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:07:54.782 #4 NEW cov: 12296 ft: 13032 corp: 3/18b lim: 20 exec/s: 0 rss: 73Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:07:54.782 #5 NEW cov: 12302 ft: 13238 corp: 4/25b lim: 20 exec/s: 0 rss: 73Mb L: 7/11 MS: 1 InsertByte- 00:07:55.041 #6 NEW cov: 12387 ft: 13544 corp: 5/36b lim: 20 exec/s: 0 rss: 73Mb L: 11/11 MS: 1 ChangeByte- 00:07:55.041 #7 NEW cov: 12404 ft: 14034 corp: 6/52b lim: 20 exec/s: 0 rss: 73Mb L: 16/16 MS: 1 InsertRepeatedBytes- 00:07:55.041 #8 NEW cov: 12404 ft: 14252 corp: 7/57b lim: 20 exec/s: 0 rss: 73Mb L: 5/16 MS: 1 InsertRepeatedBytes- 00:07:55.041 #13 NEW cov: 12404 ft: 14382 corp: 8/63b lim: 20 exec/s: 0 rss: 74Mb L: 6/16 MS: 5 CopyPart-ChangeByte-ShuffleBytes-InsertByte-CrossOver- 00:07:55.041 #14 NEW cov: 12404 ft: 14416 corp: 9/69b lim: 20 exec/s: 0 rss: 74Mb L: 6/16 MS: 1 ChangeByte- 00:07:55.298 #15 NEW cov: 12404 ft: 14474 corp: 10/74b lim: 20 exec/s: 0 rss: 74Mb L: 5/16 MS: 1 ChangeBit- 00:07:55.298 #16 NEW cov: 12404 ft: 14578 corp: 11/79b lim: 20 exec/s: 0 rss: 74Mb L: 5/16 MS: 1 CMP- DE: "\004\000"- 00:07:55.298 #20 NEW cov: 12404 ft: 14606 corp: 12/84b lim: 20 exec/s: 0 rss: 74Mb L: 5/16 MS: 4 ChangeBit-CrossOver-PersAutoDict-CopyPart- DE: "\004\000"- 00:07:55.298 #21 NEW cov: 12404 ft: 14635 corp: 13/89b lim: 20 exec/s: 0 rss: 74Mb L: 5/16 MS: 1 ChangeBit- 00:07:55.298 #22 NEW cov: 12404 ft: 14655 corp: 14/96b lim: 20 exec/s: 0 rss: 74Mb L: 7/16 MS: 1 CopyPart- 00:07:55.298 NEW_FUNC[1/1]: 0x1c683a8 in 
get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:55.298 #23 NEW cov: 12427 ft: 14727 corp: 15/102b lim: 20 exec/s: 0 rss: 74Mb L: 6/16 MS: 1 CrossOver- 00:07:55.557 #24 NEW cov: 12427 ft: 14749 corp: 16/111b lim: 20 exec/s: 0 rss: 74Mb L: 9/16 MS: 1 CrossOver- 00:07:55.557 #25 NEW cov: 12427 ft: 14768 corp: 17/117b lim: 20 exec/s: 25 rss: 74Mb L: 6/16 MS: 1 ChangeBinInt- 00:07:55.557 #26 NEW cov: 12427 ft: 14794 corp: 18/133b lim: 20 exec/s: 26 rss: 74Mb L: 16/16 MS: 1 ChangeBit- 00:07:55.557 #27 NEW cov: 12427 ft: 14820 corp: 19/138b lim: 20 exec/s: 27 rss: 74Mb L: 5/16 MS: 1 ChangeBit- 00:07:55.557 #28 NEW cov: 12427 ft: 14826 corp: 20/144b lim: 20 exec/s: 28 rss: 74Mb L: 6/16 MS: 1 InsertByte- 00:07:55.816 #29 NEW cov: 12427 ft: 14834 corp: 21/149b lim: 20 exec/s: 29 rss: 74Mb L: 5/16 MS: 1 ChangeByte- 00:07:55.816 #30 NEW cov: 12427 ft: 14841 corp: 22/155b lim: 20 exec/s: 30 rss: 74Mb L: 6/16 MS: 1 ChangeByte- 00:07:55.816 #31 NEW cov: 12427 ft: 14854 corp: 23/161b lim: 20 exec/s: 31 rss: 74Mb L: 6/16 MS: 1 ChangeBinInt- 00:07:55.816 #32 NEW cov: 12427 ft: 14881 corp: 24/167b lim: 20 exec/s: 32 rss: 74Mb L: 6/16 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:55.816 #33 NEW cov: 12431 ft: 15025 corp: 25/180b lim: 20 exec/s: 33 rss: 74Mb L: 13/16 MS: 1 InsertRepeatedBytes- 00:07:56.075 #34 NEW cov: 12431 ft: 15047 corp: 26/187b lim: 20 exec/s: 34 rss: 74Mb L: 7/16 MS: 1 ChangeBit- 00:07:56.075 #35 NEW cov: 12431 ft: 15105 corp: 27/206b lim: 20 exec/s: 35 rss: 74Mb L: 19/19 MS: 1 CopyPart- 00:07:56.075 #36 NEW cov: 12431 ft: 15127 corp: 28/223b lim: 20 exec/s: 36 rss: 75Mb L: 17/19 MS: 1 InsertByte- 00:07:56.075 #37 NEW cov: 12431 ft: 15135 corp: 29/230b lim: 20 exec/s: 37 rss: 75Mb L: 7/19 MS: 1 PersAutoDict- DE: "\004\000"- 00:07:56.075 #38 NEW cov: 12431 ft: 15147 corp: 30/249b lim: 20 exec/s: 38 rss: 75Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:56.335 #39 NEW cov: 12431 ft: 15174 corp: 31/254b lim: 20 exec/s: 39 rss: 75Mb L: 5/19 MS: 1 ShuffleBytes- 00:07:56.335 #40 NEW cov: 12431 ft: 15189 corp: 32/261b lim: 20 exec/s: 40 rss: 75Mb L: 7/19 MS: 1 InsertByte- 00:07:56.335 #41 NEW cov: 12431 ft: 15194 corp: 33/269b lim: 20 exec/s: 41 rss: 75Mb L: 8/19 MS: 1 CopyPart- 00:07:56.335 #42 NEW cov: 12431 ft: 15222 corp: 34/278b lim: 20 exec/s: 42 rss: 75Mb L: 9/19 MS: 1 ChangeByte- 00:07:56.335 #43 NEW cov: 12431 ft: 15300 corp: 35/288b lim: 20 exec/s: 43 rss: 75Mb L: 10/19 MS: 1 CopyPart- 00:07:56.335 #44 NEW cov: 12431 ft: 15334 corp: 36/294b lim: 20 exec/s: 44 rss: 75Mb L: 6/19 MS: 1 InsertByte- 00:07:56.595 #45 NEW cov: 12431 ft: 15387 corp: 37/304b lim: 20 exec/s: 22 rss: 75Mb L: 10/19 MS: 1 ChangeByte- 00:07:56.595 #45 DONE cov: 12431 ft: 15387 corp: 37/304b lim: 20 exec/s: 22 rss: 75Mb 00:07:56.595 ###### Recommended dictionary. ###### 00:07:56.595 "\004\000" # Uses: 3 00:07:56.595 ###### End of recommended dictionary. 
###### 00:07:56.595 Done 45 runs in 2 second(s) 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 4 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4404 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:56.595 12:38:26 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:07:56.595 [2024-11-28 12:38:26.661952] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:56.595 [2024-11-28 12:38:26.662021] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid607501 ] 00:07:57.165 [2024-11-28 12:38:26.983862] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:07:57.165 [2024-11-28 12:38:27.030199] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.165 [2024-11-28 12:38:27.050644] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.165 [2024-11-28 12:38:27.103308] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.165 [2024-11-28 12:38:27.119414] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:57.165 INFO: Running with entropic power schedule (0xFF, 100). 00:07:57.165 INFO: Seed: 3082968684 00:07:57.165 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:57.165 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:57.165 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:57.165 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.165 #2 INITED exec/s: 0 rss: 66Mb 00:07:57.165 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:57.165 This may also happen if the target rejected all inputs we tried so far 00:07:57.165 [2024-11-28 12:38:27.185298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.165 [2024-11-28 12:38:27.185330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.165 [2024-11-28 12:38:27.185402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.165 [2024-11-28 12:38:27.185416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.165 [2024-11-28 12:38:27.185477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.165 [2024-11-28 12:38:27.185491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.165 [2024-11-28 12:38:27.185544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.165 [2024-11-28 12:38:27.185558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.424 NEW_FUNC[1/717]: 0x464e98 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:57.424 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:57.424 #4 NEW cov: 12278 ft: 12279 corp: 2/33b lim: 35 exec/s: 0 rss: 73Mb L: 32/32 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:57.424 [2024-11-28 12:38:27.525627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.424 [2024-11-28 12:38:27.525687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.424 [2024-11-28 12:38:27.525771] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.424 [2024-11-28 12:38:27.525797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.424 [2024-11-28 12:38:27.525876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.424 [2024-11-28 12:38:27.525901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.424 [2024-11-28 12:38:27.525978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.424 [2024-11-28 12:38:27.526003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.684 #10 NEW cov: 12408 ft: 13036 corp: 3/66b lim: 35 exec/s: 0 rss: 73Mb L: 33/33 MS: 1 CopyPart- 00:07:57.684 [2024-11-28 12:38:27.595328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.595353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.595408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.595422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.595479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.595509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.595565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.595579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.684 #11 NEW cov: 12414 ft: 13222 corp: 4/98b lim: 35 exec/s: 0 rss: 73Mb L: 32/33 MS: 1 ShuffleBytes- 00:07:57.684 [2024-11-28 12:38:27.635306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.635334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.635405] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.635420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.635478] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.635492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.635545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.635558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.684 #12 NEW cov: 12499 ft: 13592 corp: 5/130b lim: 35 exec/s: 0 rss: 73Mb L: 32/33 MS: 1 ChangeByte- 00:07:57.684 [2024-11-28 12:38:27.675351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.675379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.675432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b4b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.675447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.675501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.675516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.684 [2024-11-28 12:38:27.675570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.675583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.684 #13 NEW cov: 12499 ft: 13675 corp: 6/162b lim: 35 exec/s: 0 rss: 73Mb L: 32/33 MS: 1 ChangeBit- 00:07:57.684 [2024-11-28 12:38:27.735367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.684 [2024-11-28 12:38:27.735392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.685 [2024-11-28 12:38:27.735463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.685 [2024-11-28 12:38:27.735483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.685 [2024-11-28 12:38:27.735539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.685 [2024-11-28 12:38:27.735553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.685 [2024-11-28 12:38:27.735615] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.685 [2024-11-28 12:38:27.735628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.685 #14 NEW cov: 12502 ft: 13930 corp: 7/194b lim: 35 exec/s: 0 rss: 73Mb L: 32/33 MS: 1 CopyPart- 00:07:57.685 [2024-11-28 12:38:27.775371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.685 [2024-11-28 12:38:27.775397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.685 [2024-11-28 12:38:27.775469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b4b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.685 [2024-11-28 12:38:27.775492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.685 [2024-11-28 12:38:27.775546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.685 [2024-11-28 12:38:27.775560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.685 [2024-11-28 12:38:27.775623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.685 [2024-11-28 12:38:27.775637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.944 #15 NEW cov: 12502 ft: 14075 corp: 8/226b lim: 35 exec/s: 0 rss: 73Mb L: 32/33 MS: 1 ShuffleBytes- 00:07:57.944 [2024-11-28 12:38:27.835058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.835083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.835139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.835153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.945 #16 NEW cov: 12502 ft: 14448 corp: 9/244b lim: 35 exec/s: 0 rss: 74Mb L: 18/33 MS: 1 EraseBytes- 00:07:57.945 [2024-11-28 12:38:27.895379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.895405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.895459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.895477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.895532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.895546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.895604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.895617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.945 #17 NEW cov: 12502 ft: 14508 corp: 10/277b lim: 35 exec/s: 0 rss: 74Mb L: 33/33 MS: 1 CopyPart- 00:07:57.945 [2024-11-28 12:38:27.935485] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.935512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.935567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.935581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.935634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.935648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.935701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.935716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.945 #18 NEW cov: 12502 ft: 14556 corp: 11/309b lim: 35 exec/s: 0 rss: 74Mb L: 32/33 MS: 1 ShuffleBytes- 00:07:57.945 [2024-11-28 12:38:27.995446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5210001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.995478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.995550] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.995565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.995628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.995642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:27.995694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:27.995708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.945 #19 NEW cov: 12502 ft: 14585 corp: 12/341b lim: 35 exec/s: 0 rss: 74Mb L: 32/33 MS: 1 ChangeByte- 00:07:57.945 [2024-11-28 12:38:28.055177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:28.055203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.945 [2024-11-28 12:38:28.055257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.945 [2024-11-28 12:38:28.055271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.204 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:07:58.204 #20 NEW cov: 12525 ft: 14647 corp: 13/358b lim: 35 exec/s: 0 rss: 74Mb L: 17/33 MS: 1 EraseBytes- 00:07:58.205 [2024-11-28 12:38:28.115468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.115497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.115566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.115581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.115635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.115648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.115702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.115715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.205 #21 NEW cov: 12525 ft: 14690 corp: 14/389b lim: 35 exec/s: 0 rss: 74Mb L: 31/33 MS: 1 EraseBytes- 00:07:58.205 [2024-11-28 12:38:28.155163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.155188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.155257] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5520001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.155271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.205 #22 NEW cov: 12525 ft: 14727 corp: 15/407b lim: 35 exec/s: 22 rss: 74Mb L: 18/33 MS: 1 ChangeBinInt- 00:07:58.205 [2024-11-28 12:38:28.215502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.215527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.215581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.215595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.215649] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.215662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.215716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b530b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.215729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.205 #23 NEW cov: 12525 ft: 14750 corp: 16/441b lim: 35 exec/s: 23 rss: 75Mb L: 34/34 MS: 1 InsertByte- 00:07:58.205 [2024-11-28 12:38:28.275753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.275782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.275837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.275850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.275902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.275916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.275968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.275981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.276033] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:b5b5b5b5 cdw11:b50a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.276046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.205 #24 NEW cov: 12525 ft: 14812 corp: 17/476b lim: 35 exec/s: 24 rss: 75Mb L: 35/35 MS: 1 CrossOver- 00:07:58.205 [2024-11-28 12:38:28.315567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.315592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.315646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.315660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.315713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.315727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.205 [2024-11-28 12:38:28.315780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.205 [2024-11-28 12:38:28.315793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.464 #25 NEW cov: 12525 ft: 14865 corp: 18/508b lim: 35 exec/s: 25 rss: 75Mb L: 32/35 MS: 1 ShuffleBytes- 00:07:58.464 [2024-11-28 12:38:28.355275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.355299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.355370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.355384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.464 #26 NEW cov: 12525 ft: 14899 corp: 19/524b lim: 35 exec/s: 26 rss: 75Mb L: 16/35 MS: 1 EraseBytes- 00:07:58.464 [2024-11-28 12:38:28.415312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.415339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.415395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b50a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.415409] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.464 #27 NEW cov: 12525 ft: 14917 corp: 20/538b lim: 35 exec/s: 27 rss: 75Mb L: 14/35 MS: 1 EraseBytes- 00:07:58.464 [2024-11-28 12:38:28.455677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.455703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.455757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b4b5b5b5 cdw11:b5d50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.455771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.455825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.455838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.455891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.455904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.464 #28 NEW cov: 12525 ft: 14997 corp: 21/571b lim: 35 exec/s: 28 rss: 75Mb L: 33/35 MS: 1 InsertByte- 00:07:58.464 [2024-11-28 12:38:28.515493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.515517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.515571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.515585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.515639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.515652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.464 #29 NEW cov: 12525 ft: 15219 corp: 22/593b lim: 35 exec/s: 29 rss: 75Mb L: 22/35 MS: 1 EraseBytes- 00:07:58.464 [2024-11-28 12:38:28.555712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:23b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.555736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.555806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.555821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.555874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.555891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.464 [2024-11-28 12:38:28.555944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.464 [2024-11-28 12:38:28.555959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.464 #30 NEW cov: 12525 ft: 15241 corp: 23/626b lim: 35 exec/s: 30 rss: 75Mb L: 33/35 MS: 1 InsertByte- 00:07:58.723 [2024-11-28 12:38:28.595708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.723 [2024-11-28 12:38:28.595732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.723 [2024-11-28 12:38:28.595802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b4b5b5b5 cdw11:b5d50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.723 [2024-11-28 12:38:28.595816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.723 [2024-11-28 12:38:28.595867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.723 [2024-11-28 12:38:28.595880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.723 [2024-11-28 12:38:28.595933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.723 [2024-11-28 12:38:28.595946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.723 #31 NEW cov: 12525 ft: 15323 corp: 24/659b lim: 35 exec/s: 31 rss: 75Mb L: 33/35 MS: 1 ChangeByte- 00:07:58.723 [2024-11-28 12:38:28.655771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.723 [2024-11-28 12:38:28.655795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.723 [2024-11-28 12:38:28.655866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:beb5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.723 [2024-11-28 12:38:28.655881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.723 [2024-11-28 12:38:28.655934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:58.723 [2024-11-28 12:38:28.655947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.723 [2024-11-28 12:38:28.656001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.656015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.724 #32 NEW cov: 12525 ft: 15327 corp: 25/693b lim: 35 exec/s: 32 rss: 75Mb L: 34/35 MS: 1 InsertByte- 00:07:58.724 [2024-11-28 12:38:28.695422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.695447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.724 [2024-11-28 12:38:28.695526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b532b5b5 cdw11:b5b50002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.695544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.724 #33 NEW cov: 12525 ft: 15377 corp: 26/712b lim: 35 exec/s: 33 rss: 75Mb L: 19/35 MS: 1 InsertByte- 00:07:58.724 [2024-11-28 12:38:28.755648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b1b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.755673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.724 [2024-11-28 12:38:28.755744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.755758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.724 [2024-11-28 12:38:28.755810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.755823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.724 #34 NEW cov: 12525 ft: 15393 corp: 27/734b lim: 35 exec/s: 34 rss: 75Mb L: 22/35 MS: 1 ChangeBinInt- 00:07:58.724 [2024-11-28 12:38:28.815796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.815821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.724 [2024-11-28 12:38:28.815890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:beb5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.815904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.724 [2024-11-28 12:38:28.815958] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b4b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.815971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.724 [2024-11-28 12:38:28.816022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.724 [2024-11-28 12:38:28.816036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.983 #35 NEW cov: 12525 ft: 15394 corp: 28/768b lim: 35 exec/s: 35 rss: 75Mb L: 34/35 MS: 1 ChangeBit- 00:07:58.983 [2024-11-28 12:38:28.875677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.875703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.875773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.875787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.875838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.875852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.983 #36 NEW cov: 12525 ft: 15432 corp: 29/793b lim: 35 exec/s: 36 rss: 75Mb L: 25/35 MS: 1 EraseBytes- 00:07:58.983 [2024-11-28 12:38:28.915892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5210001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.915920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.915990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.916004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.916056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.916069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.916120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.916134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.983 #37 NEW cov: 12525 ft: 15459 corp: 30/825b 
lim: 35 exec/s: 37 rss: 75Mb L: 32/35 MS: 1 CrossOver- 00:07:58.983 [2024-11-28 12:38:28.976055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.976080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.976150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.976164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.976219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.976232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.976287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:30b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.976300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:28.976353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:b5b5b5b5 cdw11:b50a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:28.976367] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.983 #38 NEW cov: 12525 ft: 15463 corp: 31/860b lim: 35 exec/s: 38 rss: 75Mb L: 35/35 MS: 1 CopyPart- 00:07:58.983 [2024-11-28 12:38:29.035950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:29.035976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:29.036048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:29.036063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:29.036116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:29.036133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:29.036185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:29.036199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.983 #39 NEW cov: 12525 ft: 15502 corp: 32/892b 
lim: 35 exec/s: 39 rss: 75Mb L: 32/35 MS: 1 CrossOver- 00:07:58.983 [2024-11-28 12:38:29.095992] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:23b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:29.096018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:29.096072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:29.096086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:29.096137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:29.096151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.983 [2024-11-28 12:38:29.096204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:58.983 [2024-11-28 12:38:29.096217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.243 #40 NEW cov: 12525 ft: 15530 corp: 33/925b lim: 35 exec/s: 40 rss: 75Mb L: 33/35 MS: 1 ChangeBit- 00:07:59.243 [2024-11-28 12:38:29.156218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:b5b5b5b5 cdw11:b55d0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.243 [2024-11-28 12:38:29.156245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.243 [2024-11-28 12:38:29.156315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.243 [2024-11-28 12:38:29.156330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.243 [2024-11-28 12:38:29.156383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.243 [2024-11-28 12:38:29.156397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.243 [2024-11-28 12:38:29.156450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:b5b5b5b5 cdw11:b5b50001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.243 [2024-11-28 12:38:29.156464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.243 [2024-11-28 12:38:29.156521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:b5b5b5b5 cdw11:b50a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.243 [2024-11-28 12:38:29.156535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.243 #41 NEW cov: 12525 ft: 15557 corp: 34/960b 
lim: 35 exec/s: 20 rss: 76Mb L: 35/35 MS: 1 ChangeByte- 00:07:59.243 #41 DONE cov: 12525 ft: 15557 corp: 34/960b lim: 35 exec/s: 20 rss: 76Mb 00:07:59.243 Done 41 runs in 2 second(s) 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 5 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4405 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:59.243 12:38:29 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:07:59.243 [2024-11-28 12:38:29.347245] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:07:59.243 [2024-11-28 12:38:29.347321] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid607864 ] 00:07:59.809 [2024-11-28 12:38:29.668348] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
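The (( i++ )) / (( i < fuzz_num )) lines come from the driver loop in ../common.sh; a plausible reconstruction, under the assumption that fuzz_num, timen, and the core mask are set by the caller, is:

# Driver loop implied by the common.sh trace: one timed run per fuzzer type.
i=0
while (( i < fuzz_num )); do
    start_llvm_fuzz "$i" "$timen" "$core"   # e.g. start_llvm_fuzz 4 1 0x1, then 5 1 0x1
    (( i++ ))
done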
00:07:59.809 [2024-11-28 12:38:29.713742] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:59.810 [2024-11-28 12:38:29.733981] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:07:59.810 [2024-11-28 12:38:29.786597] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:59.810 [2024-11-28 12:38:29.802705] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:59.810 INFO: Running with entropic power schedule (0xFF, 100). 00:07:59.810 INFO: Seed: 1471990018 00:07:59.810 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:07:59.810 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:07:59.810 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:59.810 INFO: A corpus is not provided, starting from an empty corpus 00:07:59.810 #2 INITED exec/s: 0 rss: 66Mb 00:07:59.810 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:59.810 This may also happen if the target rejected all inputs we tried so far 00:07:59.810 [2024-11-28 12:38:29.858051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e8e0a43 cdw11:37fb0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:59.810 [2024-11-28 12:38:29.858080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.068 NEW_FUNC[1/717]: 0x467038 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:00.068 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:00.068 #8 NEW cov: 12307 ft: 12291 corp: 2/10b lim: 45 exec/s: 0 rss: 73Mb L: 9/9 MS: 1 CMP- DE: "CN\2167\373\275J\000"- 00:08:00.068 [2024-11-28 12:38:30.178378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f0f0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.068 [2024-11-28 12:38:30.178419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.068 [2024-11-28 12:38:30.178479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.068 [2024-11-28 12:38:30.178494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.326 #11 NEW cov: 12420 ft: 13572 corp: 3/30b lim: 45 exec/s: 0 rss: 73Mb L: 20/20 MS: 3 InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:00.326 [2024-11-28 12:38:30.218588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d5b3440a cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.326 [2024-11-28 12:38:30.218616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.326 [2024-11-28 12:38:30.218687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b3b3b3b3 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.326 [2024-11-28 12:38:30.218700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.326 [2024-11-28 12:38:30.218758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:b3b3b3b3 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.326 [2024-11-28 12:38:30.218771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.326 [2024-11-28 12:38:30.218825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b3b3b3b3 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.326 [2024-11-28 12:38:30.218838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.326 #15 NEW cov: 12426 ft: 14115 corp: 4/67b lim: 45 exec/s: 0 rss: 73Mb L: 37/37 MS: 4 ShuffleBytes-InsertByte-InsertByte-InsertRepeatedBytes- 00:08:00.326 [2024-11-28 12:38:30.258146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:8e37434e cdw11:fbbd0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.326 [2024-11-28 12:38:30.258171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.326 #20 NEW cov: 12511 ft: 14457 corp: 5/77b lim: 45 exec/s: 0 rss: 74Mb L: 10/37 MS: 5 CopyPart-ChangeBit-CrossOver-CrossOver-PersAutoDict- DE: "CN\2167\373\275J\000"- 00:08:00.326 [2024-11-28 12:38:30.298618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.326 [2024-11-28 12:38:30.298644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.326 [2024-11-28 12:38:30.298699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.326 [2024-11-28 12:38:30.298712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.326 [2024-11-28 12:38:30.298765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.326 [2024-11-28 12:38:30.298781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.327 [2024-11-28 12:38:30.298835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.327 [2024-11-28 12:38:30.298848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.327 #21 NEW cov: 12511 ft: 14541 corp: 6/120b lim: 45 exec/s: 0 rss: 74Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:08:00.327 [2024-11-28 12:38:30.358351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f0f0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.327 [2024-11-28 12:38:30.358378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.327 [2024-11-28 12:38:30.358431] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.327 [2024-11-28 12:38:30.358445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.327 #22 NEW cov: 12511 ft: 14626 corp: 7/138b lim: 45 exec/s: 0 rss: 74Mb L: 18/43 MS: 1 EraseBytes- 00:08:00.327 [2024-11-28 12:38:30.398161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e8e0a43 cdw11:37fb0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.327 [2024-11-28 12:38:30.398186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.327 #23 NEW cov: 12511 ft: 14833 corp: 8/148b lim: 45 exec/s: 0 rss: 74Mb L: 10/43 MS: 1 InsertByte- 00:08:00.585 [2024-11-28 12:38:30.458181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e8e0a43 cdw11:38fb0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.585 [2024-11-28 12:38:30.458206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.585 #24 NEW cov: 12511 ft: 14917 corp: 9/158b lim: 45 exec/s: 0 rss: 74Mb L: 10/43 MS: 1 ChangeASCIIInt- 00:08:00.585 [2024-11-28 12:38:30.518715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e0a0a43 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.585 [2024-11-28 12:38:30.518742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.585 [2024-11-28 12:38:30.518797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.585 [2024-11-28 12:38:30.518810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.585 [2024-11-28 12:38:30.518864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.585 [2024-11-28 12:38:30.518877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.585 [2024-11-28 12:38:30.518931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0f0f400f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.585 [2024-11-28 12:38:30.518943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.585 #25 NEW cov: 12511 ft: 14993 corp: 10/199b lim: 45 exec/s: 0 rss: 74Mb L: 41/43 MS: 1 CrossOver- 00:08:00.585 [2024-11-28 12:38:30.558447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f0f0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.586 [2024-11-28 12:38:30.558480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.586 [2024-11-28 12:38:30.558535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:7e0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:00.586 [2024-11-28 12:38:30.558549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.586 #26 NEW cov: 12511 ft: 15033 corp: 11/220b lim: 45 exec/s: 0 rss: 74Mb L: 21/43 MS: 1 InsertByte- 00:08:00.586 [2024-11-28 12:38:30.598428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:400f0ac3 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.586 [2024-11-28 12:38:30.598454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.586 [2024-11-28 12:38:30.598531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.586 [2024-11-28 12:38:30.598546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.586 #32 NEW cov: 12511 ft: 15056 corp: 12/239b lim: 45 exec/s: 0 rss: 74Mb L: 19/43 MS: 1 InsertByte- 00:08:00.586 [2024-11-28 12:38:30.658351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:8e01434e cdw11:00370007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.586 [2024-11-28 12:38:30.658376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.586 #33 NEW cov: 12511 ft: 15068 corp: 13/251b lim: 45 exec/s: 0 rss: 74Mb L: 12/43 MS: 1 CMP- DE: "\001\000"- 00:08:00.844 [2024-11-28 12:38:30.718869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d5b3440a cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.844 [2024-11-28 12:38:30.718894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.844 [2024-11-28 12:38:30.718948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b3b3b3b1 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.844 [2024-11-28 12:38:30.718962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.844 [2024-11-28 12:38:30.719033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:b3b3b3b3 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.844 [2024-11-28 12:38:30.719047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.844 [2024-11-28 12:38:30.719100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b3b3b3b3 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.844 [2024-11-28 12:38:30.719113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.845 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:00.845 #34 NEW cov: 12534 ft: 15113 corp: 14/288b lim: 45 exec/s: 0 rss: 74Mb L: 37/43 MS: 1 ChangeBit- 00:08:00.845 [2024-11-28 12:38:30.778544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f2c0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 
[2024-11-28 12:38:30.778569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.845 [2024-11-28 12:38:30.778624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 [2024-11-28 12:38:30.778638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.845 #35 NEW cov: 12534 ft: 15164 corp: 15/309b lim: 45 exec/s: 0 rss: 74Mb L: 21/43 MS: 1 InsertByte- 00:08:00.845 [2024-11-28 12:38:30.818878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 [2024-11-28 12:38:30.818903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.845 [2024-11-28 12:38:30.818973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 [2024-11-28 12:38:30.818987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.845 [2024-11-28 12:38:30.819044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 [2024-11-28 12:38:30.819057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.845 [2024-11-28 12:38:30.819111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 [2024-11-28 12:38:30.819124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.845 #36 NEW cov: 12534 ft: 15230 corp: 16/352b lim: 45 exec/s: 36 rss: 74Mb L: 43/43 MS: 1 ChangeBit- 00:08:00.845 [2024-11-28 12:38:30.878589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e8e0a43 cdw11:370a0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 [2024-11-28 12:38:30.878615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.845 [2024-11-28 12:38:30.878672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 [2024-11-28 12:38:30.878686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.845 #37 NEW cov: 12534 ft: 15244 corp: 17/373b lim: 45 exec/s: 37 rss: 74Mb L: 21/43 MS: 1 CrossOver- 00:08:00.845 [2024-11-28 12:38:30.918430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e8e0a43 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.845 [2024-11-28 12:38:30.918454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.845 #38 NEW cov: 12534 ft: 15269 corp: 18/386b lim: 45 exec/s: 38 rss: 74Mb L: 13/43 MS: 1 
EraseBytes- 00:08:01.132 [2024-11-28 12:38:30.978492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e8e0a43 cdw11:0a430002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.132 [2024-11-28 12:38:30.978518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.132 #39 NEW cov: 12534 ft: 15319 corp: 19/399b lim: 45 exec/s: 39 rss: 74Mb L: 13/43 MS: 1 CopyPart- 00:08:01.132 [2024-11-28 12:38:31.038476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e8e0a43 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.132 [2024-11-28 12:38:31.038501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.132 #40 NEW cov: 12534 ft: 15400 corp: 20/412b lim: 45 exec/s: 40 rss: 74Mb L: 13/43 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:01.132 [2024-11-28 12:38:31.078483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f2c0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.132 [2024-11-28 12:38:31.078507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.132 #41 NEW cov: 12534 ft: 15414 corp: 21/424b lim: 45 exec/s: 41 rss: 75Mb L: 12/43 MS: 1 EraseBytes- 00:08:01.132 [2024-11-28 12:38:31.138528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f2c0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.132 [2024-11-28 12:38:31.138553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.132 #42 NEW cov: 12534 ft: 15420 corp: 22/436b lim: 45 exec/s: 42 rss: 75Mb L: 12/43 MS: 1 ChangeBit- 00:08:01.132 [2024-11-28 12:38:31.199050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e0a0a43 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.132 [2024-11-28 12:38:31.199075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.132 [2024-11-28 12:38:31.199146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.132 [2024-11-28 12:38:31.199161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.132 [2024-11-28 12:38:31.199215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.132 [2024-11-28 12:38:31.199228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.132 [2024-11-28 12:38:31.199281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0f0f400f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.132 [2024-11-28 12:38:31.199295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.132 #43 NEW cov: 12534 ft: 15492 corp: 23/477b lim: 45 exec/s: 43 rss: 75Mb L: 41/43 MS: 1 
PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:01.456 [2024-11-28 12:38:31.259107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:d5b3440a cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.259133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.456 [2024-11-28 12:38:31.259188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:b3b3b3b1 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.259202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.456 [2024-11-28 12:38:31.259254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:b3b3b3b3 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.259267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.456 [2024-11-28 12:38:31.259321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:b3b3b3b3 cdw11:b3b30005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.259334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.456 #44 NEW cov: 12534 ft: 15506 corp: 24/514b lim: 45 exec/s: 44 rss: 75Mb L: 37/43 MS: 1 ShuffleBytes- 00:08:01.456 [2024-11-28 12:38:31.318939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f0f0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.318964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.456 [2024-11-28 12:38:31.319019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:7e0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.319036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.456 [2024-11-28 12:38:31.319090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.319103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.456 #45 NEW cov: 12534 ft: 15724 corp: 25/543b lim: 45 exec/s: 45 rss: 75Mb L: 29/43 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:01.456 [2024-11-28 12:38:31.378635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f2c0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.378659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.456 #46 NEW cov: 12534 ft: 15765 corp: 26/560b lim: 45 exec/s: 46 rss: 75Mb L: 17/43 MS: 1 EraseBytes- 00:08:01.456 [2024-11-28 12:38:31.418968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a40 
cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.418993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.456 [2024-11-28 12:38:31.419047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:2c0fff0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.419060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.456 [2024-11-28 12:38:31.419111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.419124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.456 #47 NEW cov: 12534 ft: 15787 corp: 27/589b lim: 45 exec/s: 47 rss: 75Mb L: 29/43 MS: 1 InsertRepeatedBytes- 00:08:01.456 [2024-11-28 12:38:31.458644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:4e8e0a43 cdw11:0a430002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.458668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.456 #48 NEW cov: 12534 ft: 15793 corp: 28/602b lim: 45 exec/s: 48 rss: 75Mb L: 13/43 MS: 1 ChangeBinInt- 00:08:01.456 [2024-11-28 12:38:31.518872] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:400f0ac3 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.518897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.456 [2024-11-28 12:38:31.518952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.456 [2024-11-28 12:38:31.518966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.784 #49 NEW cov: 12534 ft: 15835 corp: 29/621b lim: 45 exec/s: 49 rss: 75Mb L: 19/43 MS: 1 ShuffleBytes- 00:08:01.784 [2024-11-28 12:38:31.578734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:224e0a43 cdw11:8e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.784 [2024-11-28 12:38:31.578760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.784 #50 NEW cov: 12534 ft: 15842 corp: 30/635b lim: 45 exec/s: 50 rss: 75Mb L: 14/43 MS: 1 InsertByte- 00:08:01.784 [2024-11-28 12:38:31.618728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:8e374d4e cdw11:fbbd0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.784 [2024-11-28 12:38:31.618756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.784 #51 NEW cov: 12534 ft: 15869 corp: 31/645b lim: 45 exec/s: 51 rss: 75Mb L: 10/43 MS: 1 ChangeBinInt- 00:08:01.784 [2024-11-28 12:38:31.658903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f2c0a40 cdw11:0f0f0000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.784 [2024-11-28 12:38:31.658929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.784 [2024-11-28 12:38:31.658985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.784 [2024-11-28 12:38:31.658999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.784 #52 NEW cov: 12534 ft: 15881 corp: 32/666b lim: 45 exec/s: 52 rss: 75Mb L: 21/43 MS: 1 ChangeByte- 00:08:01.784 [2024-11-28 12:38:31.699248] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0f2c0a40 cdw11:79790003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.784 [2024-11-28 12:38:31.699273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.784 [2024-11-28 12:38:31.699342] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:79797979 cdw11:79790003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.784 [2024-11-28 12:38:31.699356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.784 [2024-11-28 12:38:31.699407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:79797979 cdw11:790f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.784 [2024-11-28 12:38:31.699417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.784 [2024-11-28 12:38:31.699476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.785 [2024-11-28 12:38:31.699489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.785 #53 NEW cov: 12534 ft: 15906 corp: 33/706b lim: 45 exec/s: 53 rss: 75Mb L: 40/43 MS: 1 InsertRepeatedBytes- 00:08:01.785 [2024-11-28 12:38:31.759030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:710f0a40 cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.785 [2024-11-28 12:38:31.759057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.785 [2024-11-28 12:38:31.759113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0f0f0f0f cdw11:0f0f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.785 [2024-11-28 12:38:31.759128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.785 #54 NEW cov: 12534 ft: 15941 corp: 34/727b lim: 45 exec/s: 54 rss: 75Mb L: 21/43 MS: 1 InsertByte- 00:08:01.785 [2024-11-28 12:38:31.798813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:224e0a43 cdw11:8e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.785 [2024-11-28 12:38:31.798838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.785 #55 NEW cov: 12534 ft: 15960 corp: 
35/741b lim: 45 exec/s: 27 rss: 75Mb L: 14/43 MS: 1 CopyPart- 00:08:01.785 #55 DONE cov: 12534 ft: 15960 corp: 35/741b lim: 45 exec/s: 27 rss: 75Mb 00:08:01.785 ###### Recommended dictionary. ###### 00:08:01.785 "CN\2167\373\275J\000" # Uses: 1 00:08:01.785 "\001\000" # Uses: 0 00:08:01.785 "\000\000\000\000\000\000\000\000" # Uses: 2 00:08:01.785 ###### End of recommended dictionary. ###### 00:08:01.785 Done 55 runs in 2 second(s) 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 6 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4406 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:02.045 12:38:31 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:08:02.045 [2024-11-28 12:38:31.990484] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:02.045 [2024-11-28 12:38:31.990560] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid608212 ] 00:08:02.304 [2024-11-28 12:38:32.311705] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. 
There is no support for it in SPDK. Enabled only for validation. 00:08:02.304 [2024-11-28 12:38:32.357355] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.304 [2024-11-28 12:38:32.377442] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.304 [2024-11-28 12:38:32.429957] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.564 [2024-11-28 12:38:32.446050] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:02.564 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.564 INFO: Seed: 4114987517 00:08:02.564 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:02.564 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:02.564 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:02.564 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.564 #2 INITED exec/s: 0 rss: 66Mb 00:08:02.564 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:02.564 This may also happen if the target rejected all inputs we tried so far 00:08:02.564 [2024-11-28 12:38:32.490846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0e cdw11:00000000 00:08:02.564 [2024-11-28 12:38:32.490879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.823 NEW_FUNC[1/715]: 0x469848 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:02.823 NEW_FUNC[2/715]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:02.823 #4 NEW cov: 12224 ft: 12213 corp: 2/3b lim: 10 exec/s: 0 rss: 74Mb L: 2/2 MS: 2 ChangeBit-CrossOver- 00:08:02.823 [2024-11-28 12:38:32.863141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0e cdw11:00000000 00:08:02.823 [2024-11-28 12:38:32.863188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.823 #5 NEW cov: 12337 ft: 12783 corp: 3/5b lim: 10 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 CrossOver- 00:08:02.823 [2024-11-28 12:38:32.933117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f5f1 cdw11:00000000 00:08:02.823 [2024-11-28 12:38:32.933144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.081 #6 NEW cov: 12343 ft: 13083 corp: 4/7b lim: 10 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 ChangeBinInt- 00:08:03.081 [2024-11-28 12:38:32.983551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e06 cdw11:00000000 00:08:03.081 [2024-11-28 12:38:32.983578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.081 #7 NEW cov: 12428 ft: 13288 corp: 5/9b lim: 10 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 ChangeBit- 00:08:03.081 [2024-11-28 12:38:33.053716] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003af1 cdw11:00000000 00:08:03.081 [2024-11-28 12:38:33.053743] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.081 #8 NEW cov: 12428 ft: 13498 corp: 6/11b lim: 10 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 ChangeByte- 00:08:03.081 [2024-11-28 12:38:33.123779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000031f5 cdw11:00000000 00:08:03.081 [2024-11-28 12:38:33.123805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.081 #9 NEW cov: 12428 ft: 13660 corp: 7/14b lim: 10 exec/s: 0 rss: 74Mb L: 3/3 MS: 1 InsertByte- 00:08:03.081 [2024-11-28 12:38:33.174613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000031ff cdw11:00000000 00:08:03.081 [2024-11-28 12:38:33.174642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.081 [2024-11-28 12:38:33.174730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:03.081 [2024-11-28 12:38:33.174746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.081 [2024-11-28 12:38:33.174834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000031f5 cdw11:00000000 00:08:03.081 [2024-11-28 12:38:33.174852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.340 #10 NEW cov: 12428 ft: 13904 corp: 8/21b lim: 10 exec/s: 0 rss: 74Mb L: 7/7 MS: 1 CMP- DE: "\377\377\3771"- 00:08:03.340 [2024-11-28 12:38:33.244573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003af1 cdw11:00000000 00:08:03.340 [2024-11-28 12:38:33.244599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.340 [2024-11-28 12:38:33.244690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:03.340 [2024-11-28 12:38:33.244706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.340 [2024-11-28 12:38:33.244799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:03.340 [2024-11-28 12:38:33.244814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.340 #11 NEW cov: 12428 ft: 13990 corp: 9/28b lim: 10 exec/s: 0 rss: 74Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:08:03.340 [2024-11-28 12:38:33.313969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:03.340 [2024-11-28 12:38:33.313996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.340 #12 NEW cov: 12428 ft: 14052 corp: 10/31b lim: 10 exec/s: 0 rss: 74Mb L: 3/7 MS: 1 CrossOver- 00:08:03.340 [2024-11-28 12:38:33.364632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003af1 cdw11:00000000 00:08:03.340 [2024-11-28 12:38:33.364658] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.340 [2024-11-28 12:38:33.364736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:03.340 [2024-11-28 12:38:33.364751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.341 [2024-11-28 12:38:33.364836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000005b cdw11:00000000 00:08:03.341 [2024-11-28 12:38:33.364852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.341 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:03.341 #13 NEW cov: 12445 ft: 14112 corp: 11/38b lim: 10 exec/s: 0 rss: 74Mb L: 7/7 MS: 1 ChangeByte- 00:08:03.341 [2024-11-28 12:38:33.434749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000031ff cdw11:00000000 00:08:03.341 [2024-11-28 12:38:33.434776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.341 [2024-11-28 12:38:33.434871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:03.341 [2024-11-28 12:38:33.434886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.341 [2024-11-28 12:38:33.434981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:000031f5 cdw11:00000000 00:08:03.341 [2024-11-28 12:38:33.434997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.600 #14 NEW cov: 12445 ft: 14132 corp: 12/45b lim: 10 exec/s: 0 rss: 74Mb L: 7/7 MS: 1 ShuffleBytes- 00:08:03.600 [2024-11-28 12:38:33.504855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000031ff cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.504881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.600 [2024-11-28 12:38:33.504970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000fff5 cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.504986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.600 #15 NEW cov: 12445 ft: 14284 corp: 13/50b lim: 10 exec/s: 15 rss: 74Mb L: 5/7 MS: 1 EraseBytes- 00:08:03.600 [2024-11-28 12:38:33.554927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003af1 cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.554952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.600 #16 NEW cov: 12445 ft: 14334 corp: 14/52b lim: 10 exec/s: 16 rss: 74Mb L: 2/7 MS: 1 CrossOver- 00:08:03.600 [2024-11-28 12:38:33.604996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005b0a cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.605022] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.600 #18 NEW cov: 12445 ft: 14352 corp: 15/54b lim: 10 exec/s: 18 rss: 74Mb L: 2/7 MS: 2 EraseBytes-InsertByte- 00:08:03.600 [2024-11-28 12:38:33.655841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003100 cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.655866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.600 [2024-11-28 12:38:33.655955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.655972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.600 [2024-11-28 12:38:33.656064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff31 cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.656079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.600 [2024-11-28 12:38:33.656170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000f5f1 cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.656185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.600 #19 NEW cov: 12445 ft: 14570 corp: 16/62b lim: 10 exec/s: 19 rss: 74Mb L: 8/8 MS: 1 CrossOver- 00:08:03.600 [2024-11-28 12:38:33.705668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003af1 cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.705693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.600 [2024-11-28 12:38:33.705785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00006363 cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.705799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.600 [2024-11-28 12:38:33.705902] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00006363 cdw11:00000000 00:08:03.600 [2024-11-28 12:38:33.705918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.859 #20 NEW cov: 12445 ft: 14584 corp: 17/68b lim: 10 exec/s: 20 rss: 75Mb L: 6/8 MS: 1 InsertRepeatedBytes- 00:08:03.859 [2024-11-28 12:38:33.775220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f13a cdw11:00000000 00:08:03.859 [2024-11-28 12:38:33.775246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.859 #21 NEW cov: 12445 ft: 14624 corp: 18/70b lim: 10 exec/s: 21 rss: 75Mb L: 2/8 MS: 1 ShuffleBytes- 00:08:03.859 [2024-11-28 12:38:33.825267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a06 cdw11:00000000 00:08:03.859 [2024-11-28 12:38:33.825293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:03.859 #22 NEW cov: 12445 ft: 14665 corp: 19/72b lim: 10 exec/s: 22 rss: 75Mb L: 2/8 MS: 1 ChangeBit- 00:08:03.859 [2024-11-28 12:38:33.895345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0e cdw11:00000000 00:08:03.859 [2024-11-28 12:38:33.895370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.859 #23 NEW cov: 12445 ft: 14684 corp: 20/74b lim: 10 exec/s: 23 rss: 75Mb L: 2/8 MS: 1 ShuffleBytes- 00:08:03.859 [2024-11-28 12:38:33.946316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003131 cdw11:00000000 00:08:03.859 [2024-11-28 12:38:33.946343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.859 [2024-11-28 12:38:33.946448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f5ff cdw11:00000000 00:08:03.859 [2024-11-28 12:38:33.946465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.859 [2024-11-28 12:38:33.946558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff00 cdw11:00000000 00:08:03.859 [2024-11-28 12:38:33.946575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.859 [2024-11-28 12:38:33.946677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000fff1 cdw11:00000000 00:08:03.859 [2024-11-28 12:38:33.946694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.119 #24 NEW cov: 12445 ft: 14689 corp: 21/82b lim: 10 exec/s: 24 rss: 75Mb L: 8/8 MS: 1 ShuffleBytes- 00:08:04.119 [2024-11-28 12:38:34.015491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:08:04.119 [2024-11-28 12:38:34.015518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.119 #25 NEW cov: 12445 ft: 14707 corp: 22/84b lim: 10 exec/s: 25 rss: 75Mb L: 2/8 MS: 1 CrossOver- 00:08:04.119 [2024-11-28 12:38:34.085905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000031f5 cdw11:00000000 00:08:04.119 [2024-11-28 12:38:34.085932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.119 [2024-11-28 12:38:34.086021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f1f5 cdw11:00000000 00:08:04.119 [2024-11-28 12:38:34.086037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.119 #26 NEW cov: 12445 ft: 14724 corp: 23/89b lim: 10 exec/s: 26 rss: 75Mb L: 5/8 MS: 1 CopyPart- 00:08:04.119 [2024-11-28 12:38:34.135676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00005b0b cdw11:00000000 00:08:04.119 [2024-11-28 12:38:34.135703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:04.119 #27 NEW cov: 12445 ft: 14751 corp: 24/91b lim: 10 exec/s: 27 rss: 75Mb L: 2/8 MS: 1 ChangeBinInt- 00:08:04.119 [2024-11-28 12:38:34.186105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f7ff cdw11:00000000 00:08:04.119 [2024-11-28 12:38:34.186130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.119 [2024-11-28 12:38:34.186225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000fff5 cdw11:00000000 00:08:04.119 [2024-11-28 12:38:34.186240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.119 #28 NEW cov: 12445 ft: 14765 corp: 25/96b lim: 10 exec/s: 28 rss: 75Mb L: 5/8 MS: 1 ChangeByte- 00:08:04.378 [2024-11-28 12:38:34.256253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000031f1 cdw11:00000000 00:08:04.378 [2024-11-28 12:38:34.256280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.378 [2024-11-28 12:38:34.256370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f5f5 cdw11:00000000 00:08:04.378 [2024-11-28 12:38:34.256385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.378 [2024-11-28 12:38:34.256477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f1f5 cdw11:00000000 00:08:04.378 [2024-11-28 12:38:34.256493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.378 #29 NEW cov: 12445 ft: 14775 corp: 26/103b lim: 10 exec/s: 29 rss: 75Mb L: 7/8 MS: 1 CopyPart- 00:08:04.378 [2024-11-28 12:38:34.326000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 00:08:04.378 [2024-11-28 12:38:34.326027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.378 #30 NEW cov: 12445 ft: 14801 corp: 27/106b lim: 10 exec/s: 30 rss: 75Mb L: 3/8 MS: 1 InsertByte- 00:08:04.378 [2024-11-28 12:38:34.396216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:04.378 [2024-11-28 12:38:34.396243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.378 #31 NEW cov: 12452 ft: 14818 corp: 28/108b lim: 10 exec/s: 31 rss: 75Mb L: 2/8 MS: 1 ChangeBit- 00:08:04.378 [2024-11-28 12:38:34.467908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003af1 cdw11:00000000 00:08:04.378 [2024-11-28 12:38:34.467935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.378 [2024-11-28 12:38:34.468026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.379 [2024-11-28 12:38:34.468044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 
dnr:0 00:08:04.379 [2024-11-28 12:38:34.468131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.379 [2024-11-28 12:38:34.468146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.379 [2024-11-28 12:38:34.468240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:04.379 [2024-11-28 12:38:34.468255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.379 [2024-11-28 12:38:34.468345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00005b00 cdw11:00000000 00:08:04.379 [2024-11-28 12:38:34.468361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.638 #32 pulse cov: 12452 ft: 14889 corp: 28/108b lim: 10 exec/s: 16 rss: 75Mb 00:08:04.638 #32 NEW cov: 12452 ft: 14889 corp: 29/118b lim: 10 exec/s: 16 rss: 75Mb L: 10/10 MS: 1 CopyPart- 00:08:04.638 #32 DONE cov: 12452 ft: 14889 corp: 29/118b lim: 10 exec/s: 16 rss: 75Mb 00:08:04.638 ###### Recommended dictionary. ###### 00:08:04.638 "\377\377\3771" # Uses: 0 00:08:04.638 ###### End of recommended dictionary. ###### 00:08:04.638 Done 32 runs in 2 second(s) 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 7 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4407 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo 
leak:nvmf_ctrlr_create 00:08:04.638 12:38:34 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:04.638 [2024-11-28 12:38:34.663772] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:04.639 [2024-11-28 12:38:34.663839] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid608589 ] 00:08:04.898 [2024-11-28 12:38:34.988782] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:05.165 [2024-11-28 12:38:35.036225] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:05.165 [2024-11-28 12:38:35.054077] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:05.165 [2024-11-28 12:38:35.106610] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:05.165 [2024-11-28 12:38:35.122724] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:05.165 INFO: Running with entropic power schedule (0xFF, 100). 00:08:05.165 INFO: Seed: 2495016458 00:08:05.165 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:05.165 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:05.165 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:05.165 INFO: A corpus is not provided, starting from an empty corpus 00:08:05.165 #2 INITED exec/s: 0 rss: 66Mb 00:08:05.165 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:05.165 This may also happen if the target rejected all inputs we tried so far 00:08:05.165 [2024-11-28 12:38:35.169397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:05.165 [2024-11-28 12:38:35.169432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.433 NEW_FUNC[1/715]: 0x46a248 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:05.433 NEW_FUNC[2/715]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.433 #3 NEW cov: 12224 ft: 12214 corp: 2/3b lim: 10 exec/s: 0 rss: 73Mb L: 2/2 MS: 1 CrossOver- 00:08:05.433 [2024-11-28 12:38:35.519395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a0a cdw11:00000000 00:08:05.433 [2024-11-28 12:38:35.519438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.693 #4 NEW cov: 12337 ft: 12633 corp: 3/5b lim: 10 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 ChangeBit- 00:08:05.693 [2024-11-28 12:38:35.609406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.609438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.693 [2024-11-28 12:38:35.609491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.609508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.693 [2024-11-28 12:38:35.609535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.609550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.693 #5 NEW cov: 12343 ft: 13258 corp: 4/12b lim: 10 exec/s: 0 rss: 74Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:08:05.693 [2024-11-28 12:38:35.669429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.669460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.693 [2024-11-28 12:38:35.669515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.669532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.693 [2024-11-28 12:38:35.669559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.669574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.693 [2024-11-28 12:38:35.669602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 
nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.669617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.693 #6 NEW cov: 12428 ft: 13715 corp: 5/20b lim: 10 exec/s: 0 rss: 74Mb L: 8/8 MS: 1 InsertRepeatedBytes- 00:08:05.693 [2024-11-28 12:38:35.729513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.729543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.693 [2024-11-28 12:38:35.729575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000027 cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.729590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.693 [2024-11-28 12:38:35.729617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.729632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.693 [2024-11-28 12:38:35.729659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.693 [2024-11-28 12:38:35.729674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.693 #7 NEW cov: 12428 ft: 13941 corp: 6/29b lim: 10 exec/s: 0 rss: 74Mb L: 9/9 MS: 1 InsertByte- 00:08:05.952 [2024-11-28 12:38:35.819409] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a0a cdw11:00000000 00:08:05.952 [2024-11-28 12:38:35.819442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.952 #8 NEW cov: 12428 ft: 14006 corp: 7/32b lim: 10 exec/s: 0 rss: 74Mb L: 3/9 MS: 1 CrossOver- 00:08:05.952 [2024-11-28 12:38:35.909407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:05.952 [2024-11-28 12:38:35.909439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.952 #9 NEW cov: 12428 ft: 14105 corp: 8/34b lim: 10 exec/s: 0 rss: 74Mb L: 2/9 MS: 1 CrossOver- 00:08:05.952 [2024-11-28 12:38:35.959499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a1a cdw11:00000000 00:08:05.952 [2024-11-28 12:38:35.959530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.952 [2024-11-28 12:38:35.959576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.952 [2024-11-28 12:38:35.959592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.952 [2024-11-28 12:38:35.959620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.952 [2024-11-28 12:38:35.959636] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.952 [2024-11-28 12:38:35.959663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:05.952 [2024-11-28 12:38:35.959678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.953 #10 NEW cov: 12428 ft: 14135 corp: 9/42b lim: 10 exec/s: 0 rss: 74Mb L: 8/9 MS: 1 ChangeBit- 00:08:05.953 [2024-11-28 12:38:36.019411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a0a cdw11:00000000 00:08:05.953 [2024-11-28 12:38:36.019442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.953 [2024-11-28 12:38:36.019496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ea0a cdw11:00000000 00:08:05.953 [2024-11-28 12:38:36.019513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.212 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:06.212 #11 NEW cov: 12445 ft: 14341 corp: 10/46b lim: 10 exec/s: 0 rss: 74Mb L: 4/9 MS: 1 InsertByte- 00:08:06.212 [2024-11-28 12:38:36.109425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000e0a cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.109454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.212 #12 NEW cov: 12445 ft: 14461 corp: 11/48b lim: 10 exec/s: 0 rss: 74Mb L: 2/9 MS: 1 ChangeBit- 00:08:06.212 [2024-11-28 12:38:36.159540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.159569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.212 [2024-11-28 12:38:36.159614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000023ff cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.159630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.212 [2024-11-28 12:38:36.159658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.159673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.212 [2024-11-28 12:38:36.159700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.159719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.212 #13 NEW cov: 12445 ft: 14473 corp: 12/56b lim: 10 exec/s: 13 rss: 74Mb L: 8/9 MS: 1 InsertByte- 00:08:06.212 [2024-11-28 12:38:36.249628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.249659] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.212 [2024-11-28 12:38:36.249690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000023ff cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.249705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.212 [2024-11-28 12:38:36.249733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ff5d cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.249748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.212 [2024-11-28 12:38:36.249775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.212 [2024-11-28 12:38:36.249790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.212 #14 NEW cov: 12445 ft: 14493 corp: 13/65b lim: 10 exec/s: 14 rss: 74Mb L: 9/9 MS: 1 InsertByte- 00:08:06.472 [2024-11-28 12:38:36.339518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002b0a cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.339548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.472 #15 NEW cov: 12445 ft: 14522 corp: 14/67b lim: 10 exec/s: 15 rss: 74Mb L: 2/9 MS: 1 ChangeByte- 00:08:06.472 [2024-11-28 12:38:36.399521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000e00 cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.399552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.472 [2024-11-28 12:38:36.399583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.399598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.472 #16 NEW cov: 12445 ft: 14555 corp: 15/72b lim: 10 exec/s: 16 rss: 74Mb L: 5/9 MS: 1 CrossOver- 00:08:06.472 [2024-11-28 12:38:36.489670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a1a cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.489700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.472 [2024-11-28 12:38:36.489748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.489764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.472 [2024-11-28 12:38:36.489792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.489807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.472 [2024-11-28 12:38:36.489834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 
cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.489849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.472 #17 NEW cov: 12445 ft: 14600 corp: 16/81b lim: 10 exec/s: 17 rss: 74Mb L: 9/9 MS: 1 InsertByte- 00:08:06.472 [2024-11-28 12:38:36.579577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a4a cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.579612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.472 [2024-11-28 12:38:36.579643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ea0a cdw11:00000000 00:08:06.472 [2024-11-28 12:38:36.579658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.732 #18 NEW cov: 12445 ft: 14621 corp: 17/85b lim: 10 exec/s: 18 rss: 75Mb L: 4/9 MS: 1 ChangeBit- 00:08:06.732 [2024-11-28 12:38:36.669646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000aff cdw11:00000000 00:08:06.732 [2024-11-28 12:38:36.669677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.732 [2024-11-28 12:38:36.669708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.732 [2024-11-28 12:38:36.669723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.732 [2024-11-28 12:38:36.669750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:06.732 [2024-11-28 12:38:36.669764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.732 #19 NEW cov: 12445 ft: 14622 corp: 18/92b lim: 10 exec/s: 19 rss: 75Mb L: 7/9 MS: 1 ChangeByte- 00:08:06.732 [2024-11-28 12:38:36.729628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.732 [2024-11-28 12:38:36.729658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.732 [2024-11-28 12:38:36.729689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:06.732 [2024-11-28 12:38:36.729705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.732 #20 NEW cov: 12445 ft: 14636 corp: 19/96b lim: 10 exec/s: 20 rss: 75Mb L: 4/9 MS: 1 EraseBytes- 00:08:06.732 [2024-11-28 12:38:36.819596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000230a cdw11:00000000 00:08:06.732 [2024-11-28 12:38:36.819626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.991 #21 NEW cov: 12445 ft: 14648 corp: 20/98b lim: 10 exec/s: 21 rss: 75Mb L: 2/9 MS: 1 ChangeByte- 00:08:06.991 [2024-11-28 12:38:36.909615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000060a 
cdw11:00000000 00:08:06.992 [2024-11-28 12:38:36.909645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.992 #22 NEW cov: 12445 ft: 14666 corp: 21/100b lim: 10 exec/s: 22 rss: 75Mb L: 2/9 MS: 1 ChangeBit- 00:08:06.992 [2024-11-28 12:38:36.969641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001d1a cdw11:00000000 00:08:06.992 [2024-11-28 12:38:36.969672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.992 [2024-11-28 12:38:36.969718] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:06.992 [2024-11-28 12:38:36.969733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.992 #23 NEW cov: 12445 ft: 14695 corp: 22/104b lim: 10 exec/s: 23 rss: 75Mb L: 4/9 MS: 1 InsertByte- 00:08:06.992 [2024-11-28 12:38:37.029829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a1a cdw11:00000000 00:08:06.992 [2024-11-28 12:38:37.029867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.992 [2024-11-28 12:38:37.029900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.992 [2024-11-28 12:38:37.029916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.992 [2024-11-28 12:38:37.029945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.992 [2024-11-28 12:38:37.029961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.992 [2024-11-28 12:38:37.029989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00002e00 cdw11:00000000 00:08:06.992 [2024-11-28 12:38:37.030005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.992 #24 NEW cov: 12452 ft: 14719 corp: 23/112b lim: 10 exec/s: 24 rss: 75Mb L: 8/9 MS: 1 ChangeByte- 00:08:06.992 [2024-11-28 12:38:37.089825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:06.992 [2024-11-28 12:38:37.089856] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.992 [2024-11-28 12:38:37.089888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.992 [2024-11-28 12:38:37.089903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.992 [2024-11-28 12:38:37.089930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.992 [2024-11-28 12:38:37.089945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.992 [2024-11-28 12:38:37.089972] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000025 cdw11:00000000 00:08:06.992 [2024-11-28 12:38:37.089987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.252 #25 NEW cov: 12452 ft: 14728 corp: 24/121b lim: 10 exec/s: 25 rss: 75Mb L: 9/9 MS: 1 InsertByte- 00:08:07.252 [2024-11-28 12:38:37.149683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00002323 cdw11:00000000 00:08:07.252 [2024-11-28 12:38:37.149713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.252 #26 NEW cov: 12452 ft: 14756 corp: 25/124b lim: 10 exec/s: 13 rss: 75Mb L: 3/9 MS: 1 CopyPart- 00:08:07.252 #26 DONE cov: 12452 ft: 14756 corp: 25/124b lim: 10 exec/s: 13 rss: 75Mb 00:08:07.252 Done 26 runs in 2 second(s) 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 8 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4408 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:07.252 12:38:37 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:07.252 [2024-11-28 
12:38:37.370586] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:07.252 [2024-11-28 12:38:37.370660] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid608951 ] 00:08:07.821 [2024-11-28 12:38:37.687307] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:07.821 [2024-11-28 12:38:37.732825] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.821 [2024-11-28 12:38:37.749668] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:07.821 [2024-11-28 12:38:37.802181] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:07.821 [2024-11-28 12:38:37.818286] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:07.821 INFO: Running with entropic power schedule (0xFF, 100). 00:08:07.821 INFO: Seed: 897063169 00:08:07.821 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:07.821 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:07.821 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:07.821 INFO: A corpus is not provided, starting from an empty corpus 00:08:07.821 [2024-11-28 12:38:37.873722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.821 [2024-11-28 12:38:37.873751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.821 #2 INITED cov: 12251 ft: 12250 corp: 1/1b exec/s: 0 rss: 72Mb 00:08:07.821 [2024-11-28 12:38:37.913810] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.821 [2024-11-28 12:38:37.913835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.821 [2024-11-28 12:38:37.913910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:07.821 [2024-11-28 12:38:37.913925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.080 #3 NEW cov: 12364 ft: 13477 corp: 2/3b lim: 5 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 InsertByte- 00:08:08.080 [2024-11-28 12:38:37.973653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.080 [2024-11-28 12:38:37.973678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.080 #4 NEW cov: 12370 ft: 13800 corp: 3/4b lim: 5 exec/s: 0 rss: 72Mb L: 1/2 MS: 1 CrossOver- 00:08:08.080 [2024-11-28 12:38:38.013661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.080 [2024-11-28 
12:38:38.013685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.080 #5 NEW cov: 12455 ft: 14108 corp: 4/5b lim: 5 exec/s: 0 rss: 72Mb L: 1/2 MS: 1 ShuffleBytes- 00:08:08.080 [2024-11-28 12:38:38.053713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.080 [2024-11-28 12:38:38.053741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.080 #6 NEW cov: 12455 ft: 14228 corp: 5/6b lim: 5 exec/s: 0 rss: 72Mb L: 1/2 MS: 1 ChangeBinInt- 00:08:08.080 [2024-11-28 12:38:38.113928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.080 [2024-11-28 12:38:38.113955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.080 [2024-11-28 12:38:38.114016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.080 [2024-11-28 12:38:38.114030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.080 #7 NEW cov: 12455 ft: 14287 corp: 6/8b lim: 5 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 InsertByte- 00:08:08.080 [2024-11-28 12:38:38.153925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.080 [2024-11-28 12:38:38.153949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.080 [2024-11-28 12:38:38.154010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.080 [2024-11-28 12:38:38.154024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.080 #8 NEW cov: 12455 ft: 14353 corp: 7/10b lim: 5 exec/s: 0 rss: 72Mb L: 2/2 MS: 1 ShuffleBytes- 00:08:08.339 [2024-11-28 12:38:38.213966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.339 [2024-11-28 12:38:38.213991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.339 [2024-11-28 12:38:38.214050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.339 [2024-11-28 12:38:38.214063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.339 #9 NEW cov: 12455 ft: 14404 corp: 8/12b lim: 5 exec/s: 0 rss: 73Mb L: 2/2 MS: 1 InsertByte- 00:08:08.339 [2024-11-28 12:38:38.274477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.339 [2024-11-28 
12:38:38.274502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.340 [2024-11-28 12:38:38.274577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.340 [2024-11-28 12:38:38.274592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.340 [2024-11-28 12:38:38.274660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.340 [2024-11-28 12:38:38.274675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.340 [2024-11-28 12:38:38.274734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.340 [2024-11-28 12:38:38.274748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.340 [2024-11-28 12:38:38.274806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.340 [2024-11-28 12:38:38.274819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:08.340 #10 NEW cov: 12455 ft: 14874 corp: 9/17b lim: 5 exec/s: 0 rss: 73Mb L: 5/5 MS: 1 CMP- DE: "\001\000\000\010"- 00:08:08.340 [2024-11-28 12:38:38.333846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.340 [2024-11-28 12:38:38.333872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.340 #11 NEW cov: 12455 ft: 14916 corp: 10/18b lim: 5 exec/s: 0 rss: 73Mb L: 1/5 MS: 1 ChangeByte- 00:08:08.340 [2024-11-28 12:38:38.373984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.340 [2024-11-28 12:38:38.374009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.340 [2024-11-28 12:38:38.374083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.340 [2024-11-28 12:38:38.374099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.340 #12 NEW cov: 12455 ft: 14935 corp: 11/20b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 CopyPart- 00:08:08.340 [2024-11-28 12:38:38.433888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.340 [2024-11-28 12:38:38.433914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.340 #13 NEW cov: 12455 ft: 14946 corp: 12/21b 
lim: 5 exec/s: 0 rss: 73Mb L: 1/5 MS: 1 CopyPart- 00:08:08.599 [2024-11-28 12:38:38.473900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.473925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.599 #14 NEW cov: 12455 ft: 15028 corp: 13/22b lim: 5 exec/s: 0 rss: 73Mb L: 1/5 MS: 1 ChangeByte- 00:08:08.599 [2024-11-28 12:38:38.514404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.514428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.599 [2024-11-28 12:38:38.514504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.514519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.599 [2024-11-28 12:38:38.514580] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.514594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.599 [2024-11-28 12:38:38.514653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.514667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.599 #15 NEW cov: 12455 ft: 15044 corp: 14/26b lim: 5 exec/s: 0 rss: 73Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:08:08.599 [2024-11-28 12:38:38.554083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.554108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.599 [2024-11-28 12:38:38.554182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.554196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.599 #16 NEW cov: 12455 ft: 15100 corp: 15/28b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 CopyPart- 00:08:08.599 [2024-11-28 12:38:38.614107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.614132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.599 [2024-11-28 12:38:38.614193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 
cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.614207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.599 #17 NEW cov: 12455 ft: 15125 corp: 16/30b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 InsertByte- 00:08:08.599 [2024-11-28 12:38:38.654194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.654220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.599 [2024-11-28 12:38:38.654279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.654293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.599 #18 NEW cov: 12455 ft: 15192 corp: 17/32b lim: 5 exec/s: 0 rss: 73Mb L: 2/5 MS: 1 ChangeBit- 00:08:08.599 [2024-11-28 12:38:38.694204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.599 [2024-11-28 12:38:38.694229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.600 [2024-11-28 12:38:38.694288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.600 [2024-11-28 12:38:38.694302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.860 #19 NEW cov: 12455 ft: 15219 corp: 18/34b lim: 5 exec/s: 0 rss: 74Mb L: 2/5 MS: 1 EraseBytes- 00:08:08.860 [2024-11-28 12:38:38.754422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.860 [2024-11-28 12:38:38.754453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.860 [2024-11-28 12:38:38.754536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.860 [2024-11-28 12:38:38.754551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.860 [2024-11-28 12:38:38.754607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.860 [2024-11-28 12:38:38.754631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.119 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:09.119 #20 NEW cov: 12478 ft: 15424 corp: 19/37b lim: 5 exec/s: 20 rss: 75Mb L: 3/5 MS: 1 InsertByte- 00:08:09.119 [2024-11-28 12:38:39.074311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.119 [2024-11-28 12:38:39.074345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.119 [2024-11-28 12:38:39.074418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.119 [2024-11-28 12:38:39.074432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.119 #21 NEW cov: 12478 ft: 15456 corp: 20/39b lim: 5 exec/s: 21 rss: 75Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:09.119 [2024-11-28 12:38:39.114209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.119 [2024-11-28 12:38:39.114235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.119 [2024-11-28 12:38:39.114290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.119 [2024-11-28 12:38:39.114303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.119 #22 NEW cov: 12478 ft: 15488 corp: 21/41b lim: 5 exec/s: 22 rss: 75Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:09.119 [2024-11-28 12:38:39.174072] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.119 [2024-11-28 12:38:39.174098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.119 #23 NEW cov: 12478 ft: 15501 corp: 22/42b lim: 5 exec/s: 23 rss: 75Mb L: 1/5 MS: 1 ChangeByte- 00:08:09.119 [2024-11-28 12:38:39.234066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.119 [2024-11-28 12:38:39.234091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.378 #24 NEW cov: 12478 ft: 15514 corp: 23/43b lim: 5 exec/s: 24 rss: 75Mb L: 1/5 MS: 1 ShuffleBytes- 00:08:09.378 [2024-11-28 12:38:39.274377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.378 [2024-11-28 12:38:39.274401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.378 [2024-11-28 12:38:39.274483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.378 [2024-11-28 12:38:39.274497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.378 [2024-11-28 12:38:39.274561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:09.378 [2024-11-28 12:38:39.274575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.378 #25 NEW cov: 12478 ft: 15546 corp: 24/46b lim: 5 exec/s: 25 rss: 75Mb L: 3/5 MS: 1 CrossOver- 00:08:09.378 [2024-11-28 12:38:39.314253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.378 [2024-11-28 12:38:39.314277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.378 [2024-11-28 12:38:39.314334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.378 [2024-11-28 12:38:39.314348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.378 #26 NEW cov: 12478 ft: 15598 corp: 25/48b lim: 5 exec/s: 26 rss: 75Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:09.378 [2024-11-28 12:38:39.374095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.378 [2024-11-28 12:38:39.374119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.378 #27 NEW cov: 12478 ft: 15622 corp: 26/49b lim: 5 exec/s: 27 rss: 75Mb L: 1/5 MS: 1 CopyPart- 00:08:09.379 [2024-11-28 12:38:39.434111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.379 [2024-11-28 12:38:39.434135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.379 #28 NEW cov: 12478 ft: 15633 corp: 27/50b lim: 5 exec/s: 28 rss: 75Mb L: 1/5 MS: 1 CrossOver- 00:08:09.379 [2024-11-28 12:38:39.494148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.379 [2024-11-28 12:38:39.494173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.637 #29 NEW cov: 12478 ft: 15643 corp: 28/51b lim: 5 exec/s: 29 rss: 75Mb L: 1/5 MS: 1 ChangeByte- 00:08:09.637 [2024-11-28 12:38:39.554180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.637 [2024-11-28 12:38:39.554205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.637 #30 NEW cov: 12478 ft: 15650 corp: 29/52b lim: 5 exec/s: 30 rss: 75Mb L: 1/5 MS: 1 EraseBytes- 00:08:09.637 [2024-11-28 12:38:39.614200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.638 [2024-11-28 12:38:39.614225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.638 #31 NEW cov: 12478 ft: 15676 corp: 30/53b lim: 5 exec/s: 31 rss: 75Mb L: 
1/5 MS: 1 CopyPart- 00:08:09.638 [2024-11-28 12:38:39.674525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.638 [2024-11-28 12:38:39.674551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.638 [2024-11-28 12:38:39.674638] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.638 [2024-11-28 12:38:39.674651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.638 [2024-11-28 12:38:39.674720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.638 [2024-11-28 12:38:39.674734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.638 #32 NEW cov: 12478 ft: 15706 corp: 31/56b lim: 5 exec/s: 32 rss: 75Mb L: 3/5 MS: 1 ShuffleBytes- 00:08:09.638 [2024-11-28 12:38:39.734281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.638 [2024-11-28 12:38:39.734306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.638 #33 NEW cov: 12478 ft: 15714 corp: 32/57b lim: 5 exec/s: 33 rss: 75Mb L: 1/5 MS: 1 EraseBytes- 00:08:09.898 [2024-11-28 12:38:39.774441] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.898 [2024-11-28 12:38:39.774466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.898 [2024-11-28 12:38:39.774546] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.898 [2024-11-28 12:38:39.774561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.898 #34 NEW cov: 12478 ft: 15728 corp: 33/59b lim: 5 exec/s: 34 rss: 75Mb L: 2/5 MS: 1 InsertByte- 00:08:09.898 [2024-11-28 12:38:39.814451] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.898 [2024-11-28 12:38:39.814481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.898 [2024-11-28 12:38:39.814554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.898 [2024-11-28 12:38:39.814568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.898 #35 NEW cov: 12478 ft: 15741 corp: 34/61b lim: 5 exec/s: 17 rss: 75Mb L: 2/5 MS: 1 CrossOver- 00:08:09.898 #35 DONE cov: 12478 ft: 15741 corp: 34/61b lim: 5 exec/s: 17 
rss: 75Mb 00:08:09.898 ###### Recommended dictionary. ###### 00:08:09.898 "\001\000\000\010" # Uses: 0 00:08:09.898 ###### End of recommended dictionary. ###### 00:08:09.898 Done 35 runs in 2 second(s) 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 9 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4409 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:09.898 12:38:39 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:09.898 [2024-11-28 12:38:40.005905] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:09.898 [2024-11-28 12:38:40.005988] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid609294 ] 00:08:10.467 [2024-11-28 12:38:40.339149] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
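The NEW_FUNC markers in these runs point into the harness source: TestOneInput at llvm_nvme_fuzz.c:780 and per-opcode helpers such as fuzz_admin_delete_io_submission_queue_command at llvm_nvme_fuzz.c:172. For orientation only, the sketch below shows the usual shape of such a libFuzzer target: fuzz bytes are copied into admin-command dwords (the cdw10/cdw11 values echoed by the nvme_qpair print lines above) and handed to a submission path. This is a minimal, hypothetical illustration, not SPDK's actual code; struct admin_cmd and submit_admin_command() are stand-ins invented for the example.

    /* Hedged sketch of a libFuzzer target, not SPDK's real harness.
     * Build (assumption): clang -g -fsanitize=fuzzer sketch.c */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    struct admin_cmd {            /* stand-in for a real NVMe admin command */
        uint8_t  opc;             /* opcode; 0x00 = DELETE IO SQ, as in the log */
        uint32_t cdw10;           /* command dword 10, filled from fuzz input */
        uint32_t cdw11;           /* command dword 11, filled from fuzz input */
    };

    /* Hypothetical stand-in for the real admin-qpair submission path. */
    static void submit_admin_command(const struct admin_cmd *cmd)
    {
        (void)cmd;
    }

    /* libFuzzer entry point; the real harness's equivalent is TestOneInput. */
    int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
    {
        struct admin_cmd cmd = { .opc = 0x00 };

        if (size < 2 * sizeof(uint32_t)) {
            return 0;             /* need two dwords of fuzz data */
        }
        memcpy(&cmd.cdw10, data, sizeof(uint32_t));
        memcpy(&cmd.cdw11, data + sizeof(uint32_t), sizeof(uint32_t));
        submit_admin_command(&cmd);
        return 0;
    }

Built with coverage instrumentation, a target of this shape produces status lines of the kind seen throughout this log as new inputs extend coverage.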
00:08:10.467 [2024-11-28 12:38:40.386203] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.467 [2024-11-28 12:38:40.407792] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.467 [2024-11-28 12:38:40.460759] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.467 [2024-11-28 12:38:40.476862] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:10.467 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.467 INFO: Seed: 3554061427 00:08:10.467 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:10.467 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:10.467 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:10.467 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.467 [2024-11-28 12:38:40.547296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.467 [2024-11-28 12:38:40.547341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.467 #2 INITED cov: 12229 ft: 12250 corp: 1/1b exec/s: 0 rss: 72Mb 00:08:10.726 [2024-11-28 12:38:40.597486] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.726 [2024-11-28 12:38:40.597518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.985 NEW_FUNC[1/1]: 0x1fbe6a8 in thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1080 00:08:10.985 #3 NEW cov: 12364 ft: 12767 corp: 2/2b lim: 5 exec/s: 0 rss: 73Mb L: 1/1 MS: 1 CopyPart- 00:08:10.985 [2024-11-28 12:38:40.928277] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.985 [2024-11-28 12:38:40.928314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.985 #4 NEW cov: 12370 ft: 13232 corp: 3/3b lim: 5 exec/s: 0 rss: 73Mb L: 1/1 MS: 1 ChangeBit- 00:08:10.985 [2024-11-28 12:38:40.978874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.985 [2024-11-28 12:38:40.978913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.985 [2024-11-28 12:38:40.978996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.985 [2024-11-28 12:38:40.979012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.985 #5 NEW cov: 12455 ft: 14265 corp: 4/5b lim: 5 exec/s: 0 rss: 73Mb L: 2/2 MS: 1 CrossOver- 00:08:10.985 [2024-11-28 12:38:41.029017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.985 
[2024-11-28 12:38:41.029044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.985 [2024-11-28 12:38:41.029134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.985 [2024-11-28 12:38:41.029150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.985 #6 NEW cov: 12455 ft: 14369 corp: 5/7b lim: 5 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 ShuffleBytes- 00:08:10.985 [2024-11-28 12:38:41.099162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.985 [2024-11-28 12:38:41.099190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.985 [2024-11-28 12:38:41.099278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.985 [2024-11-28 12:38:41.099295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.244 #7 NEW cov: 12455 ft: 14512 corp: 6/9b lim: 5 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 ChangeByte- 00:08:11.244 [2024-11-28 12:38:41.169169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.244 [2024-11-28 12:38:41.169196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.244 [2024-11-28 12:38:41.169285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.244 [2024-11-28 12:38:41.169302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.244 #8 NEW cov: 12455 ft: 14553 corp: 7/11b lim: 5 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 CopyPart- 00:08:11.244 [2024-11-28 12:38:41.239584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.244 [2024-11-28 12:38:41.239612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.244 [2024-11-28 12:38:41.239703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.244 [2024-11-28 12:38:41.239719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.244 #9 NEW cov: 12455 ft: 14589 corp: 8/13b lim: 5 exec/s: 0 rss: 74Mb L: 2/2 MS: 1 ChangeBinInt- 00:08:11.244 [2024-11-28 12:38:41.290579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.244 [2024-11-28 12:38:41.290619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.244 [2024-11-28 12:38:41.290721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.244 [2024-11-28 12:38:41.290737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.244 [2024-11-28 12:38:41.290827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.244 [2024-11-28 12:38:41.290843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.244 [2024-11-28 12:38:41.290928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.245 [2024-11-28 12:38:41.290945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.245 #10 NEW cov: 12455 ft: 14956 corp: 9/17b lim: 5 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:11.245 [2024-11-28 12:38:41.350708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.245 [2024-11-28 12:38:41.350738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.245 [2024-11-28 12:38:41.350851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.245 [2024-11-28 12:38:41.350866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.245 [2024-11-28 12:38:41.350947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.245 [2024-11-28 12:38:41.350962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.245 [2024-11-28 12:38:41.351063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.245 [2024-11-28 12:38:41.351079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.504 #11 NEW cov: 12455 ft: 14993 corp: 10/21b lim: 5 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:11.504 [2024-11-28 12:38:41.399630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-11-28 12:38:41.399657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.504 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:11.504 #12 NEW cov: 12478 ft: 15080 corp: 11/22b lim: 5 exec/s: 0 rss: 74Mb L: 1/4 MS: 1 ChangeByte- 00:08:11.504 
[2024-11-28 12:38:41.470762] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-11-28 12:38:41.470787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.504 [2024-11-28 12:38:41.470880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-11-28 12:38:41.470899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.504 [2024-11-28 12:38:41.470983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-11-28 12:38:41.470998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.504 [2024-11-28 12:38:41.471091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-11-28 12:38:41.471105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.504 #13 NEW cov: 12478 ft: 15107 corp: 12/26b lim: 5 exec/s: 0 rss: 74Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:11.504 [2024-11-28 12:38:41.540065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-11-28 12:38:41.540089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.504 [2024-11-28 12:38:41.540181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-11-28 12:38:41.540196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.504 #14 NEW cov: 12478 ft: 15172 corp: 13/28b lim: 5 exec/s: 14 rss: 74Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:11.504 [2024-11-28 12:38:41.609777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.504 [2024-11-28 12:38:41.609802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 #15 NEW cov: 12478 ft: 15192 corp: 14/29b lim: 5 exec/s: 15 rss: 74Mb L: 1/4 MS: 1 EraseBytes- 00:08:11.764 [2024-11-28 12:38:41.660484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.660509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 [2024-11-28 12:38:41.660602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:11.764 [2024-11-28 12:38:41.660618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.764 #16 NEW cov: 12478 ft: 15216 corp: 15/31b lim: 5 exec/s: 16 rss: 74Mb L: 2/4 MS: 1 CopyPart- 00:08:11.764 [2024-11-28 12:38:41.711022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.711046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 [2024-11-28 12:38:41.711140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.711155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.764 [2024-11-28 12:38:41.711242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.711259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.764 #17 NEW cov: 12478 ft: 15394 corp: 16/34b lim: 5 exec/s: 17 rss: 74Mb L: 3/4 MS: 1 CrossOver- 00:08:11.764 [2024-11-28 12:38:41.761029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.761054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 [2024-11-28 12:38:41.761150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.761166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.764 [2024-11-28 12:38:41.761253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.761268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.764 #18 NEW cov: 12478 ft: 15419 corp: 17/37b lim: 5 exec/s: 18 rss: 74Mb L: 3/4 MS: 1 CopyPart- 00:08:11.764 [2024-11-28 12:38:41.830756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.830781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.764 [2024-11-28 12:38:41.830866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.764 [2024-11-28 12:38:41.830881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.764 #19 NEW cov: 12478 ft: 15434 corp: 
18/39b lim: 5 exec/s: 19 rss: 74Mb L: 2/4 MS: 1 ChangeASCIIInt- 00:08:12.024 [2024-11-28 12:38:41.902114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:41.902141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.024 [2024-11-28 12:38:41.902236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:41.902252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.024 [2024-11-28 12:38:41.902346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:41.902362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.024 [2024-11-28 12:38:41.902449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:41.902463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.024 [2024-11-28 12:38:41.902565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:41.902581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.024 #20 NEW cov: 12478 ft: 15502 corp: 19/44b lim: 5 exec/s: 20 rss: 74Mb L: 5/5 MS: 1 CrossOver- 00:08:12.024 [2024-11-28 12:38:41.970590] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:41.970618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.024 #21 NEW cov: 12478 ft: 15510 corp: 20/45b lim: 5 exec/s: 21 rss: 74Mb L: 1/5 MS: 1 ChangeBit- 00:08:12.024 [2024-11-28 12:38:42.021127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:42.021152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.024 [2024-11-28 12:38:42.021233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:42.021248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.024 #22 NEW cov: 12478 ft: 15554 corp: 21/47b lim: 5 exec/s: 22 rss: 74Mb L: 2/5 MS: 1 ChangeASCIIInt- 00:08:12.024 [2024-11-28 12:38:42.091298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 
nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:42.091323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.024 [2024-11-28 12:38:42.091417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:42.091431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.024 #23 NEW cov: 12478 ft: 15561 corp: 22/49b lim: 5 exec/s: 23 rss: 74Mb L: 2/5 MS: 1 CopyPart- 00:08:12.024 [2024-11-28 12:38:42.142438] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:42.142464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.024 [2024-11-28 12:38:42.142567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.024 [2024-11-28 12:38:42.142584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.024 [2024-11-28 12:38:42.142668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.025 [2024-11-28 12:38:42.142684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.025 [2024-11-28 12:38:42.142774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.025 [2024-11-28 12:38:42.142791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.025 [2024-11-28 12:38:42.142875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.025 [2024-11-28 12:38:42.142890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.283 #24 NEW cov: 12478 ft: 15570 corp: 23/54b lim: 5 exec/s: 24 rss: 74Mb L: 5/5 MS: 1 CopyPart- 00:08:12.283 [2024-11-28 12:38:42.211706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.284 [2024-11-28 12:38:42.211731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.284 [2024-11-28 12:38:42.211830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.284 [2024-11-28 12:38:42.211845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.284 #25 NEW cov: 12478 ft: 15584 corp: 24/56b lim: 5 exec/s: 25 rss: 
75Mb L: 2/5 MS: 1 ShuffleBytes- 00:08:12.284 [2024-11-28 12:38:42.282234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.284 [2024-11-28 12:38:42.282259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.284 [2024-11-28 12:38:42.282352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.284 [2024-11-28 12:38:42.282368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.284 [2024-11-28 12:38:42.282452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.284 [2024-11-28 12:38:42.282467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.284 #26 NEW cov: 12478 ft: 15589 corp: 25/59b lim: 5 exec/s: 26 rss: 75Mb L: 3/5 MS: 1 EraseBytes- 00:08:12.284 [2024-11-28 12:38:42.331648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.284 [2024-11-28 12:38:42.331673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.284 [2024-11-28 12:38:42.401776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.284 [2024-11-28 12:38:42.401803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.543 #28 NEW cov: 12478 ft: 15597 corp: 26/60b lim: 5 exec/s: 28 rss: 75Mb L: 1/5 MS: 2 EraseBytes-ShuffleBytes- 00:08:12.543 [2024-11-28 12:38:42.452255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.543 [2024-11-28 12:38:42.452285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.543 [2024-11-28 12:38:42.452377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.543 [2024-11-28 12:38:42.452394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.543 #29 NEW cov: 12478 ft: 15646 corp: 27/62b lim: 5 exec/s: 29 rss: 75Mb L: 2/5 MS: 1 InsertByte- 00:08:12.543 [2024-11-28 12:38:42.502635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.543 [2024-11-28 12:38:42.502665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.543 [2024-11-28 12:38:42.502759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.543 [2024-11-28 12:38:42.502775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.543 [2024-11-28 12:38:42.502868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.543 [2024-11-28 12:38:42.502887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.543 #30 NEW cov: 12478 ft: 15706 corp: 28/65b lim: 5 exec/s: 15 rss: 75Mb L: 3/5 MS: 1 InsertByte- 00:08:12.543 #30 DONE cov: 12478 ft: 15706 corp: 28/65b lim: 5 exec/s: 15 rss: 75Mb 00:08:12.543 Done 30 runs in 2 second(s) 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 10 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4410 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:12.543 12:38:42 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:12.802 [2024-11-28 12:38:42.683721] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:08:12.802 [2024-11-28 12:38:42.683793] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid609669 ] 00:08:13.061 [2024-11-28 12:38:43.005020] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:13.061 [2024-11-28 12:38:43.050502] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.061 [2024-11-28 12:38:43.071755] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.061 [2024-11-28 12:38:43.124262] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:13.061 [2024-11-28 12:38:43.140368] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:13.061 INFO: Running with entropic power schedule (0xFF, 100). 00:08:13.061 INFO: Seed: 1924098871 00:08:13.061 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:13.061 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:13.061 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:13.061 INFO: A corpus is not provided, starting from an empty corpus 00:08:13.061 #2 INITED exec/s: 0 rss: 66Mb 00:08:13.061 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:13.061 This may also happen if the target rejected all inputs we tried so far 00:08:13.319 [2024-11-28 12:38:43.196127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.319 [2024-11-28 12:38:43.196155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.319 [2024-11-28 12:38:43.196213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.319 [2024-11-28 12:38:43.196227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.319 [2024-11-28 12:38:43.196287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.319 [2024-11-28 12:38:43.196300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.319 [2024-11-28 12:38:43.196355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.319 [2024-11-28 12:38:43.196368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.578 NEW_FUNC[1/715]: 0x46bbc8 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:13.578 NEW_FUNC[2/715]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.578 #5 NEW cov: 12271 ft: 12270 corp: 2/35b lim: 40 
exec/s: 0 rss: 73Mb L: 34/34 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:13.578 [2024-11-28 12:38:43.516073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.578 [2024-11-28 12:38:43.516114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.578 [2024-11-28 12:38:43.516183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.578 [2024-11-28 12:38:43.516200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.578 NEW_FUNC[1/1]: 0x1980e68 in nvme_qpair_is_admin_queue /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1190 00:08:13.578 #6 NEW cov: 12387 ft: 13503 corp: 3/55b lim: 40 exec/s: 0 rss: 73Mb L: 20/34 MS: 1 InsertRepeatedBytes- 00:08:13.578 [2024-11-28 12:38:43.566212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.578 [2024-11-28 12:38:43.566238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.578 [2024-11-28 12:38:43.566302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.578 [2024-11-28 12:38:43.566316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.578 [2024-11-28 12:38:43.566377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.578 [2024-11-28 12:38:43.566391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.578 [2024-11-28 12:38:43.566454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.578 [2024-11-28 12:38:43.566476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.578 #7 NEW cov: 12393 ft: 13782 corp: 4/89b lim: 40 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 CrossOver- 00:08:13.578 [2024-11-28 12:38:43.625995] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a2a2a2a2 cdw11:a0a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.578 [2024-11-28 12:38:43.626021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.578 [2024-11-28 12:38:43.626100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.578 [2024-11-28 12:38:43.626115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.578 #8 NEW cov: 12478 ft: 14018 corp: 5/109b lim: 40 exec/s: 0 rss: 73Mb L: 20/34 
MS: 1 ChangeBinInt- 00:08:13.579 [2024-11-28 12:38:43.686032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a20014a2 cdw11:a0a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.579 [2024-11-28 12:38:43.686058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.579 [2024-11-28 12:38:43.686124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.579 [2024-11-28 12:38:43.686139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.838 #9 NEW cov: 12478 ft: 14183 corp: 6/129b lim: 40 exec/s: 0 rss: 74Mb L: 20/34 MS: 1 ChangeBinInt- 00:08:13.838 [2024-11-28 12:38:43.746272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.746297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.746373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.746389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.746450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.746464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.746535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000022 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.746549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.838 #10 NEW cov: 12478 ft: 14279 corp: 7/163b lim: 40 exec/s: 0 rss: 74Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:13.838 [2024-11-28 12:38:43.805910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.805937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.838 #11 NEW cov: 12478 ft: 14599 corp: 8/177b lim: 40 exec/s: 0 rss: 74Mb L: 14/34 MS: 1 EraseBytes- 00:08:13.838 [2024-11-28 12:38:43.846047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.846076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.846141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a0a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.846155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.838 #12 NEW cov: 12478 ft: 14629 corp: 9/197b lim: 40 exec/s: 0 rss: 74Mb L: 20/34 MS: 1 ShuffleBytes- 00:08:13.838 [2024-11-28 12:38:43.886393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.886418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.886487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.886502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.886563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:007e0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.886577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.886639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.886652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.838 #13 NEW cov: 12478 ft: 14766 corp: 10/231b lim: 40 exec/s: 0 rss: 74Mb L: 34/34 MS: 1 ChangeByte- 00:08:13.838 [2024-11-28 12:38:43.926228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.926254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.926332] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.926347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.838 [2024-11-28 12:38:43.926408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:13.838 [2024-11-28 12:38:43.926423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.838 #14 NEW cov: 12478 ft: 15010 corp: 11/262b lim: 40 exec/s: 0 rss: 74Mb L: 31/34 MS: 1 EraseBytes- 00:08:14.098 [2024-11-28 12:38:43.966170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a20014a2 cdw11:a0a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:43.966196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 
12:38:43.966259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:43.966272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.098 #15 NEW cov: 12478 ft: 15096 corp: 12/283b lim: 40 exec/s: 0 rss: 74Mb L: 21/34 MS: 1 InsertByte- 00:08:14.098 [2024-11-28 12:38:44.026428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.026452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.026535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.026550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.026615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:007e0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.026628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.026688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.026702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.098 #16 NEW cov: 12478 ft: 15137 corp: 13/318b lim: 40 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 CrossOver- 00:08:14.098 [2024-11-28 12:38:44.086314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.086340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.086404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.086417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.086481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:007e0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.086510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.098 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:14.098 #17 NEW cov: 12501 ft: 15199 corp: 14/347b lim: 40 exec/s: 0 rss: 74Mb L: 29/35 MS: 1 EraseBytes- 00:08:14.098 [2024-11-28 12:38:44.146343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY 
RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.146368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.146430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.146444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.146506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:007e0060 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.146519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.098 #18 NEW cov: 12501 ft: 15244 corp: 15/377b lim: 40 exec/s: 18 rss: 74Mb L: 30/35 MS: 1 InsertByte- 00:08:14.098 [2024-11-28 12:38:44.206369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.206397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.206481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.206496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.098 [2024-11-28 12:38:44.206558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.098 [2024-11-28 12:38:44.206572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.357 #19 NEW cov: 12501 ft: 15256 corp: 16/408b lim: 40 exec/s: 19 rss: 74Mb L: 31/35 MS: 1 ShuffleBytes- 00:08:14.357 [2024-11-28 12:38:44.266234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.266259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.357 [2024-11-28 12:38:44.266321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.266335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.357 #20 NEW cov: 12501 ft: 15269 corp: 17/427b lim: 40 exec/s: 20 rss: 74Mb L: 19/35 MS: 1 CrossOver- 00:08:14.357 [2024-11-28 12:38:44.306456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a20014a2 cdw11:a0a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.306485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.357 [2024-11-28 12:38:44.306566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:9a9a9a9a cdw11:9a9a9a9a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.306581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.357 [2024-11-28 12:38:44.306643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:9a9aa2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.306656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.357 #21 NEW cov: 12501 ft: 15287 corp: 18/458b lim: 40 exec/s: 21 rss: 74Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:08:14.357 [2024-11-28 12:38:44.366219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.366244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.357 #22 NEW cov: 12501 ft: 15366 corp: 19/471b lim: 40 exec/s: 22 rss: 74Mb L: 13/35 MS: 1 CrossOver- 00:08:14.357 [2024-11-28 12:38:44.426634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002d00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.426660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.357 [2024-11-28 12:38:44.426738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.426752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.357 [2024-11-28 12:38:44.426816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.426830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.357 [2024-11-28 12:38:44.426893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.357 [2024-11-28 12:38:44.426907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.357 #23 NEW cov: 12501 ft: 15370 corp: 20/503b lim: 40 exec/s: 23 rss: 74Mb L: 32/35 MS: 1 InsertByte- 00:08:14.617 [2024-11-28 12:38:44.486558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.486583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.486647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.486661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.486725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:007e0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.486740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.617 #24 NEW cov: 12501 ft: 15399 corp: 21/532b lim: 40 exec/s: 24 rss: 74Mb L: 29/35 MS: 1 ShuffleBytes- 00:08:14.617 [2024-11-28 12:38:44.526278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.526302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.617 #25 NEW cov: 12501 ft: 15423 corp: 22/544b lim: 40 exec/s: 25 rss: 74Mb L: 12/35 MS: 1 EraseBytes- 00:08:14.617 [2024-11-28 12:38:44.566586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.566611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.566690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.566705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.566767] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:007e0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.566781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.617 #26 NEW cov: 12501 ft: 15437 corp: 23/573b lim: 40 exec/s: 26 rss: 74Mb L: 29/35 MS: 1 ChangeBinInt- 00:08:14.617 [2024-11-28 12:38:44.606333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.606357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.617 #32 NEW cov: 12501 ft: 15486 corp: 24/585b lim: 40 exec/s: 32 rss: 74Mb L: 12/35 MS: 1 CopyPart- 00:08:14.617 [2024-11-28 12:38:44.666771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.666799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.666883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 
12:38:44.666897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.666958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.666972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.667035] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00929292 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.667048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.617 #33 NEW cov: 12501 ft: 15535 corp: 25/620b lim: 40 exec/s: 33 rss: 74Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:14.617 [2024-11-28 12:38:44.706862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002d00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.706887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.706966] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.706981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.707041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.707055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.617 [2024-11-28 12:38:44.707115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.617 [2024-11-28 12:38:44.707129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.877 #34 NEW cov: 12501 ft: 15566 corp: 26/652b lim: 40 exec/s: 34 rss: 75Mb L: 32/35 MS: 1 ShuffleBytes- 00:08:14.877 [2024-11-28 12:38:44.766850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.766876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.766938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.766951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.767016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.767030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.767093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.767109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.877 #35 NEW cov: 12501 ft: 15579 corp: 27/686b lim: 40 exec/s: 35 rss: 75Mb L: 34/35 MS: 1 CopyPart- 00:08:14.877 [2024-11-28 12:38:44.806636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.806664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.806729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.806743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.806807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.806825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.877 #36 NEW cov: 12501 ft: 15643 corp: 28/711b lim: 40 exec/s: 36 rss: 75Mb L: 25/35 MS: 1 EraseBytes- 00:08:14.877 [2024-11-28 12:38:44.846582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.846607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.846699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.846713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.877 #39 NEW cov: 12501 ft: 15664 corp: 29/727b lim: 40 exec/s: 39 rss: 75Mb L: 16/35 MS: 3 CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:14.877 [2024-11-28 12:38:44.886881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00001717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.886905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.886988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:17000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.887003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.887064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.887078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.887141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:7e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.887155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.877 #40 NEW cov: 12501 ft: 15693 corp: 30/764b lim: 40 exec/s: 40 rss: 75Mb L: 37/37 MS: 1 InsertRepeatedBytes- 00:08:14.877 [2024-11-28 12:38:44.926449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.926480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.877 #41 NEW cov: 12501 ft: 15710 corp: 31/777b lim: 40 exec/s: 41 rss: 75Mb L: 13/37 MS: 1 ChangeByte- 00:08:14.877 [2024-11-28 12:38:44.986802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.986827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.986893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.986907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.877 [2024-11-28 12:38:44.986967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:007e0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:14.877 [2024-11-28 12:38:44.986981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.140 #42 NEW cov: 12501 ft: 15738 corp: 32/806b lim: 40 exec/s: 42 rss: 75Mb L: 29/37 MS: 1 ChangeBinInt- 00:08:15.140 [2024-11-28 12:38:45.047007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a20014a2 cdw11:a0a2a2a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.140 [2024-11-28 12:38:45.047034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.140 [2024-11-28 12:38:45.047099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a2a2a2a2 cdw11:a2a2a2ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.140 [2024-11-28 12:38:45.047114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.140 [2024-11-28 12:38:45.047176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.140 [2024-11-28 12:38:45.047190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.140 [2024-11-28 12:38:45.047253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:ffffffa2 cdw11:a2a2a20a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.140 [2024-11-28 12:38:45.047267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.140 #43 NEW cov: 12501 ft: 15743 corp: 33/838b lim: 40 exec/s: 43 rss: 75Mb L: 32/37 MS: 1 InsertRepeatedBytes- 00:08:15.140 [2024-11-28 12:38:45.086725] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.140 [2024-11-28 12:38:45.086750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.140 [2024-11-28 12:38:45.086812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.140 [2024-11-28 12:38:45.086826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.140 #44 NEW cov: 12501 ft: 15787 corp: 34/859b lim: 40 exec/s: 44 rss: 75Mb L: 21/37 MS: 1 CrossOver- 00:08:15.141 [2024-11-28 12:38:45.147015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:000017e2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.141 [2024-11-28 12:38:45.147040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.141 [2024-11-28 12:38:45.147122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:17000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.141 [2024-11-28 12:38:45.147139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.141 [2024-11-28 12:38:45.147202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.141 [2024-11-28 12:38:45.147215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.141 [2024-11-28 12:38:45.147278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:7e000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:15.141 [2024-11-28 12:38:45.147292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.141 #45 NEW cov: 12501 ft: 15793 corp: 35/896b lim: 40 exec/s: 22 rss: 75Mb L: 37/37 MS: 1 ChangeBinInt- 00:08:15.141 #45 DONE cov: 12501 ft: 15793 corp: 35/896b lim: 40 exec/s: 22 rss: 75Mb 00:08:15.141 Done 45 runs in 2 second(s) 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:15.409 12:38:45 
llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 11 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4411 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:15.409 12:38:45 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:15.409 [2024-11-28 12:38:45.345947] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:15.409 [2024-11-28 12:38:45.346022] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid610028 ] 00:08:15.667 [2024-11-28 12:38:45.665046] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:15.667 [2024-11-28 12:38:45.712053] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.667 [2024-11-28 12:38:45.729171] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.668 [2024-11-28 12:38:45.781753] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:15.927 [2024-11-28 12:38:45.797846] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:15.927 INFO: Running with entropic power schedule (0xFF, 100). 
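[Editorial note on the trace above: nvmf/run.sh prepares each fuzzer instance by deriving a private TCP port from the fuzzer index (44 + printf %02d 11 = 4411), rewriting the trsvcid in fuzz_json.conf to that port, writing two known teardown leaks into an LSAN suppression file, and then launching llvm_nvme_fuzz against the resulting target. A minimal sketch of that setup logic follows, assuming SPDK_DIR stands in for the /var/jenkins/workspace/short-fuzz-phy-autotest/spdk workspace path, and assuming the sed/echo outputs are redirected to the files named in the later commands, since a shell trace does not record redirections.]

  #!/usr/bin/env bash
  # Sketch of start_llvm_fuzz for fuzzer type 11; values taken from the trace above.
  fuzzer_type=11
  timen=1                                    # -t: run time in seconds per fuzzer
  core=0x1                                   # -m: reactor core mask
  port=44$(printf %02d "$fuzzer_type")       # 4411 for fuzzer 11
  nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
  suppress_file=/var/tmp/suppress_nvmf_fuzz
  corpus_dir=$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}

  mkdir -p "$corpus_dir"
  # Point this instance's target config at its own listener port (default is 4420).
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  # Suppress known teardown leaks rather than failing the fuzzing run on them.
  echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"
  echo leak:nvmf_ctrlr_create          >> "$suppress_file"

  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
  LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
    "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m "$core" -s 512 -P "$SPDK_DIR/../output/llvm/" \
      -F "$trid" -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"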
00:08:15.927 INFO: Seed: 286118710 00:08:15.927 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:15.927 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:15.927 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:15.927 INFO: A corpus is not provided, starting from an empty corpus 00:08:15.927 #2 INITED exec/s: 0 rss: 66Mb 00:08:15.927 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:15.927 This may also happen if the target rejected all inputs we tried so far 00:08:15.927 [2024-11-28 12:38:45.853717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cbff0700 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.927 [2024-11-28 12:38:45.853745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.927 [2024-11-28 12:38:45.853801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.927 [2024-11-28 12:38:45.853815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.927 [2024-11-28 12:38:45.853869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.927 [2024-11-28 12:38:45.853882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.927 [2024-11-28 12:38:45.853937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.927 [2024-11-28 12:38:45.853950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.187 NEW_FUNC[1/714]: 0x46d938 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:16.187 NEW_FUNC[2/714]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:16.187 #5 NEW cov: 12249 ft: 12285 corp: 2/38b lim: 40 exec/s: 0 rss: 73Mb L: 37/37 MS: 3 InsertRepeatedBytes-CMP-InsertRepeatedBytes- DE: "\377\007"- 00:08:16.187 [2024-11-28 12:38:46.173607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.173642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.187 [2024-11-28 12:38:46.173696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.173710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.187 [2024-11-28 12:38:46.173766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:16.187 [2024-11-28 12:38:46.173780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.187 NEW_FUNC[1/3]: 0x1c613f8 in event_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:595 00:08:16.187 NEW_FUNC[2/3]: 0x1c62a48 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:932 00:08:16.187 #6 NEW cov: 12399 ft: 13234 corp: 3/63b lim: 40 exec/s: 0 rss: 74Mb L: 25/37 MS: 1 CrossOver- 00:08:16.187 [2024-11-28 12:38:46.213676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.213703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.187 [2024-11-28 12:38:46.213761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.213775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.187 [2024-11-28 12:38:46.213828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.213841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.187 [2024-11-28 12:38:46.213897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.213910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.187 #11 NEW cov: 12405 ft: 13486 corp: 4/101b lim: 40 exec/s: 0 rss: 74Mb L: 38/38 MS: 5 ChangeBit-PersAutoDict-EraseBytes-ChangeBit-InsertRepeatedBytes- DE: "\377\007"- 00:08:16.187 [2024-11-28 12:38:46.253229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.253256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.187 #13 NEW cov: 12490 ft: 14416 corp: 5/115b lim: 40 exec/s: 0 rss: 74Mb L: 14/38 MS: 2 ChangeBinInt-CrossOver- 00:08:16.187 [2024-11-28 12:38:46.293712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.293738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.187 [2024-11-28 12:38:46.293811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.293825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.187 [2024-11-28 12:38:46.293880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.293893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.187 [2024-11-28 12:38:46.293948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.187 [2024-11-28 12:38:46.293961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.447 #14 NEW cov: 12490 ft: 14577 corp: 6/154b lim: 40 exec/s: 0 rss: 74Mb L: 39/39 MS: 1 CopyPart- 00:08:16.447 [2024-11-28 12:38:46.353786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.353813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.353868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.353886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.353941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.353954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.354009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:292929ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.354022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.447 #15 NEW cov: 12490 ft: 14681 corp: 7/193b lim: 40 exec/s: 0 rss: 74Mb L: 39/39 MS: 1 PersAutoDict- DE: "\377\007"- 00:08:16.447 [2024-11-28 12:38:46.413768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cbff0700 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.413793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.413849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.413862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.413918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:ff070000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.413931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.413986] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.413999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.447 #16 NEW cov: 12490 ft: 14754 corp: 8/230b lim: 40 exec/s: 0 rss: 74Mb L: 37/39 MS: 1 PersAutoDict- DE: "\377\007"- 00:08:16.447 [2024-11-28 12:38:46.473594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4f40000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.473619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.473672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.473686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.473757] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.473771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.447 #17 NEW cov: 12490 ft: 14764 corp: 9/258b lim: 40 exec/s: 0 rss: 74Mb L: 28/39 MS: 1 CopyPart- 00:08:16.447 [2024-11-28 12:38:46.533801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:41414141 cdw11:41414141 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.533827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.533883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.533900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.447 [2024-11-28 12:38:46.533956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.447 [2024-11-28 12:38:46.533969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.448 [2024-11-28 12:38:46.534024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.448 [2024-11-28 12:38:46.534037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.708 #18 NEW cov: 12490 ft: 14808 corp: 10/291b lim: 40 exec/s: 0 rss: 74Mb L: 33/39 MS: 1 InsertRepeatedBytes- 00:08:16.708 [2024-11-28 12:38:46.593653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4f40000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.593678] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.593737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.593751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.593807] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.593820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.708 #19 NEW cov: 12490 ft: 14892 corp: 11/319b lim: 40 exec/s: 0 rss: 74Mb L: 28/39 MS: 1 ChangeBit- 00:08:16.708 [2024-11-28 12:38:46.653832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.653857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.653914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.653928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.653984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.653997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.654055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.654068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.708 #20 NEW cov: 12490 ft: 14953 corp: 12/358b lim: 40 exec/s: 0 rss: 74Mb L: 39/39 MS: 1 ShuffleBytes- 00:08:16.708 [2024-11-28 12:38:46.693694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.693719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.693790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:1c008000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.693807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.693863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.693876] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.708 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:16.708 #21 NEW cov: 12513 ft: 14999 corp: 13/386b lim: 40 exec/s: 0 rss: 74Mb L: 28/39 MS: 1 ChangeBinInt- 00:08:16.708 [2024-11-28 12:38:46.753868] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.753893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.753949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.753963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.754034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.754048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.754103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.754116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.708 #22 NEW cov: 12513 ft: 15040 corp: 14/425b lim: 40 exec/s: 0 rss: 74Mb L: 39/39 MS: 1 ShuffleBytes- 00:08:16.708 [2024-11-28 12:38:46.813926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.813951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.814008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.814022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.814076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.814089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.708 [2024-11-28 12:38:46.814142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.708 [2024-11-28 12:38:46.814155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.968 #23 NEW cov: 12513 ft: 15087 corp: 15/464b lim: 40 exec/s: 23 rss: 75Mb L: 39/39 MS: 1 
ShuffleBytes- 00:08:16.968 [2024-11-28 12:38:46.873917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.873942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.874003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.874016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.874070] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.874084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.874136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.874149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.968 #24 NEW cov: 12513 ft: 15143 corp: 16/503b lim: 40 exec/s: 24 rss: 75Mb L: 39/39 MS: 1 ChangeByte- 00:08:16.968 [2024-11-28 12:38:46.914077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:6f292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.914102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.914160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.914174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.914230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.914244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.914299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.914311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.914367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:29292929 cdw11:2929064a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.914380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:16.968 #25 NEW cov: 12513 ft: 15279 corp: 17/543b lim: 40 exec/s: 25 rss: 75Mb L: 
40/40 MS: 1 InsertByte- 00:08:16.968 [2024-11-28 12:38:46.973982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.974006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.974065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.974079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.974135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:2929b729 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.974148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:46.974204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:46.974220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.968 #26 NEW cov: 12513 ft: 15330 corp: 18/582b lim: 40 exec/s: 26 rss: 75Mb L: 39/40 MS: 1 ChangeByte- 00:08:16.968 [2024-11-28 12:38:47.013994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:47.014019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:47.014077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:47.014090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:47.014162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:47.014176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:47.014234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:47.014247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:16.968 #27 NEW cov: 12513 ft: 15337 corp: 19/619b lim: 40 exec/s: 27 rss: 75Mb L: 37/40 MS: 1 EraseBytes- 00:08:16.968 [2024-11-28 12:38:47.053866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:47.053890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:47.053945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:47.053959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.968 [2024-11-28 12:38:47.054015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00f6ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.968 [2024-11-28 12:38:47.054028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.968 #33 NEW cov: 12513 ft: 15393 corp: 20/644b lim: 40 exec/s: 33 rss: 75Mb L: 25/40 MS: 1 ChangeBinInt- 00:08:17.228 [2024-11-28 12:38:47.094262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:6f292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.094287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.094346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.094360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.094416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.094430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.094490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.094506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.094559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:29292929 cdw11:2929304a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.094573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.228 #34 NEW cov: 12513 ft: 15414 corp: 21/684b lim: 40 exec/s: 34 rss: 75Mb L: 40/40 MS: 1 ChangeByte- 00:08:17.228 [2024-11-28 12:38:47.153912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.153937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.153998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:1c008000 cdw11:99000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.154011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.154066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.154079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.228 #40 NEW cov: 12513 ft: 15434 corp: 22/712b lim: 40 exec/s: 40 rss: 75Mb L: 28/40 MS: 1 ChangeByte- 00:08:17.228 [2024-11-28 12:38:47.214090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.214116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.214188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.214203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.214258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.214272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.214325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:292929d7 cdw11:d6d6d6d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.214339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.228 #41 NEW cov: 12513 ft: 15437 corp: 23/751b lim: 40 exec/s: 41 rss: 75Mb L: 39/40 MS: 1 ChangeBinInt- 00:08:17.228 [2024-11-28 12:38:47.253965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.253990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.254046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.254059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.254115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00290000 cdw11:00f6ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.254131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.228 #42 NEW cov: 12513 ft: 15457 corp: 24/776b lim: 40 exec/s: 42 rss: 75Mb L: 25/40 MS: 1 CrossOver- 00:08:17.228 [2024-11-28 12:38:47.314137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:17.228 [2024-11-28 12:38:47.314163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.314223] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.314238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.314293] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.314306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.228 [2024-11-28 12:38:47.314363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.228 [2024-11-28 12:38:47.314377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.488 #43 NEW cov: 12513 ft: 15471 corp: 25/813b lim: 40 exec/s: 43 rss: 75Mb L: 37/40 MS: 1 ShuffleBytes- 00:08:17.488 [2024-11-28 12:38:47.373997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a4b4b4b cdw11:4b4b4b4b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.374023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.374078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:4b4b4b4b cdw11:4b4b4b4b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.374091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.374148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:4b4b4b4b cdw11:4b4b4b4b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.374161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.488 #44 NEW cov: 12513 ft: 15479 corp: 26/839b lim: 40 exec/s: 44 rss: 75Mb L: 26/40 MS: 1 InsertRepeatedBytes- 00:08:17.488 [2024-11-28 12:38:47.414176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.414201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.414259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.414272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.414329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.414342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.414398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.414418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.488 #45 NEW cov: 12513 ft: 15491 corp: 27/876b lim: 40 exec/s: 45 rss: 75Mb L: 37/40 MS: 1 ShuffleBytes- 00:08:17.488 [2024-11-28 12:38:47.474376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:2929ff29 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.474400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.474458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.474475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.474532] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.474544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.474601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.474614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.474670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:29292929 cdw11:2929064a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.474683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.488 #46 NEW cov: 12513 ft: 15532 corp: 28/916b lim: 40 exec/s: 46 rss: 75Mb L: 40/40 MS: 1 InsertByte- 00:08:17.488 [2024-11-28 12:38:47.514066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4f40000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.514091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.514146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.514160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.488 [2024-11-28 12:38:47.514219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.488 [2024-11-28 12:38:47.514231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.488 #47 NEW cov: 12513 ft: 15543 corp: 29/944b lim: 40 exec/s: 47 rss: 75Mb L: 28/40 MS: 1 ShuffleBytes- 00:08:17.488 [2024-11-28 12:38:47.553788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4000000 cdw11:003a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.489 [2024-11-28 12:38:47.553811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.489 #48 NEW cov: 12513 ft: 15612 corp: 30/958b lim: 40 exec/s: 48 rss: 75Mb L: 14/40 MS: 1 ChangeByte- 00:08:17.489 [2024-11-28 12:38:47.594109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4f40000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.489 [2024-11-28 12:38:47.594133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.489 [2024-11-28 12:38:47.594195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.489 [2024-11-28 12:38:47.594209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.489 [2024-11-28 12:38:47.594279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000400 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.489 [2024-11-28 12:38:47.594293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.748 #49 NEW cov: 12513 ft: 15680 corp: 31/986b lim: 40 exec/s: 49 rss: 75Mb L: 28/40 MS: 1 ChangeBit- 00:08:17.748 [2024-11-28 12:38:47.634456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:6f292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.634485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.634543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.634563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.634617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.634630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.634686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.634699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 
12:38:47.634754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:29292929 cdw11:2929354a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.634767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:17.748 #50 NEW cov: 12513 ft: 15693 corp: 32/1026b lim: 40 exec/s: 50 rss: 75Mb L: 40/40 MS: 1 ChangeASCIIInt- 00:08:17.748 [2024-11-28 12:38:47.694213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4f40000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.694239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.694299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00008000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.694313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.694368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:00001c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.694381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.748 #51 NEW cov: 12513 ft: 15700 corp: 33/1054b lim: 40 exec/s: 51 rss: 75Mb L: 28/40 MS: 1 ChangeBinInt- 00:08:17.748 [2024-11-28 12:38:47.733881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f4000000 cdw11:0e000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.733907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.748 #52 NEW cov: 12513 ft: 15754 corp: 34/1068b lim: 40 exec/s: 52 rss: 75Mb L: 14/40 MS: 1 ChangeBinInt- 00:08:17.748 [2024-11-28 12:38:47.774033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f40000ff cdw11:0700003a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.774059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.774117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.774131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.748 #53 NEW cov: 12513 ft: 15957 corp: 35/1084b lim: 40 exec/s: 53 rss: 75Mb L: 16/40 MS: 1 PersAutoDict- DE: "\377\007"- 00:08:17.748 [2024-11-28 12:38:47.834407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.834433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.834507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 
nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.834522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.834577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.834591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.748 [2024-11-28 12:38:47.834648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:292929d7 cdw11:d6d6d6d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:17.748 [2024-11-28 12:38:47.834661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.008 #54 NEW cov: 12513 ft: 15983 corp: 36/1123b lim: 40 exec/s: 27 rss: 75Mb L: 39/40 MS: 1 ShuffleBytes- 00:08:18.008 #54 DONE cov: 12513 ft: 15983 corp: 36/1123b lim: 40 exec/s: 27 rss: 75Mb 00:08:18.008 ###### Recommended dictionary. ###### 00:08:18.008 "\377\007" # Uses: 4 00:08:18.008 ###### End of recommended dictionary. ###### 00:08:18.008 Done 54 runs in 2 second(s) 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 12 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4412 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:18.008 12:38:47 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:18.008 [2024-11-28 12:38:48.027278] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:18.008 [2024-11-28 12:38:48.027353] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid610382 ] 00:08:18.267 [2024-11-28 12:38:48.346592] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:18.527 [2024-11-28 12:38:48.393897] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.527 [2024-11-28 12:38:48.413432] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.527 [2024-11-28 12:38:48.465967] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.527 [2024-11-28 12:38:48.482074] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:18.527 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.527 INFO: Seed: 2971123100 00:08:18.527 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:18.527 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:18.527 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:18.527 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.527 #2 INITED exec/s: 0 rss: 67Mb 00:08:18.527 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:18.527 This may also happen if the target rejected all inputs we tried so far 00:08:18.527 [2024-11-28 12:38:48.527085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.527 [2024-11-28 12:38:48.527119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.527 [2024-11-28 12:38:48.527169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.527 [2024-11-28 12:38:48.527185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.527 [2024-11-28 12:38:48.527216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.527 [2024-11-28 12:38:48.527232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.527 [2024-11-28 12:38:48.527261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.527 [2024-11-28 12:38:48.527276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.787 NEW_FUNC[1/717]: 0x46f6a8 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:18.787 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.787 #4 NEW cov: 12284 ft: 12283 corp: 2/36b lim: 40 exec/s: 0 rss: 74Mb L: 35/35 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:18.787 [2024-11-28 12:38:48.877130] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.787 [2024-11-28 12:38:48.877174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.787 [2024-11-28 12:38:48.877224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.787 [2024-11-28 12:38:48.877240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.787 [2024-11-28 12:38:48.877271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.787 [2024-11-28 12:38:48.877288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.787 [2024-11-28 12:38:48.877318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:18.787 [2024-11-28 12:38:48.877333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.046 #15 NEW cov: 12397 
ft: 12673 corp: 3/71b lim: 40 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 ChangeByte- 00:08:19.046 [2024-11-28 12:38:48.967073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.046 [2024-11-28 12:38:48.967105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.046 [2024-11-28 12:38:48.967139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff535353 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.046 [2024-11-28 12:38:48.967154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.046 [2024-11-28 12:38:48.967184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.046 [2024-11-28 12:38:48.967200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.046 [2024-11-28 12:38:48.967229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.046 [2024-11-28 12:38:48.967244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.046 #16 NEW cov: 12403 ft: 12957 corp: 4/109b lim: 40 exec/s: 0 rss: 75Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:19.046 [2024-11-28 12:38:49.056937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a8a0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.046 [2024-11-28 12:38:49.056969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.046 [2024-11-28 12:38:49.057003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:535353ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.046 [2024-11-28 12:38:49.057019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.046 #20 NEW cov: 12488 ft: 13706 corp: 5/128b lim: 40 exec/s: 0 rss: 75Mb L: 19/38 MS: 4 ChangeBit-InsertByte-InsertByte-CrossOver- 00:08:19.046 [2024-11-28 12:38:49.117030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.046 [2024-11-28 12:38:49.117065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.046 [2024-11-28 12:38:49.117114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.046 [2024-11-28 12:38:49.117130] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.046 [2024-11-28 12:38:49.117160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.047 [2024-11-28 
12:38:49.117176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.047 [2024-11-28 12:38:49.117205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ff2300ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.047 [2024-11-28 12:38:49.117220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.047 #21 NEW cov: 12488 ft: 13763 corp: 6/163b lim: 40 exec/s: 0 rss: 75Mb L: 35/38 MS: 1 ChangeBinInt- 00:08:19.047 [2024-11-28 12:38:49.166909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a8a0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.047 [2024-11-28 12:38:49.166939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.047 [2024-11-28 12:38:49.166988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:5353ad00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.047 [2024-11-28 12:38:49.167003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.306 #22 NEW cov: 12488 ft: 13837 corp: 7/182b lim: 40 exec/s: 0 rss: 75Mb L: 19/38 MS: 1 ChangeBinInt- 00:08:19.306 [2024-11-28 12:38:49.256930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.306 [2024-11-28 12:38:49.256962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.306 [2024-11-28 12:38:49.257010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.306 [2024-11-28 12:38:49.257026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.306 #23 NEW cov: 12488 ft: 13923 corp: 8/205b lim: 40 exec/s: 0 rss: 75Mb L: 23/38 MS: 1 EraseBytes- 00:08:19.306 [2024-11-28 12:38:49.347024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.306 [2024-11-28 12:38:49.347054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.306 [2024-11-28 12:38:49.347102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.306 [2024-11-28 12:38:49.347118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.306 [2024-11-28 12:38:49.347148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffff0a cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.306 [2024-11-28 12:38:49.347164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.306 #29 NEW cov: 12488 ft: 14157 corp: 9/233b lim: 40 exec/s: 0 rss: 75Mb L: 28/38 MS: 1 
CrossOver- 00:08:19.307 [2024-11-28 12:38:49.407152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.307 [2024-11-28 12:38:49.407185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.307 [2024-11-28 12:38:49.407218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:fffffdff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.307 [2024-11-28 12:38:49.407234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.307 [2024-11-28 12:38:49.407264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.307 [2024-11-28 12:38:49.407279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.307 [2024-11-28 12:38:49.407308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.307 [2024-11-28 12:38:49.407323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.566 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:19.566 #30 NEW cov: 12505 ft: 14234 corp: 10/268b lim: 40 exec/s: 0 rss: 75Mb L: 35/38 MS: 1 ChangeBit- 00:08:19.566 [2024-11-28 12:38:49.466952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a8a0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.466982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.566 #31 NEW cov: 12505 ft: 14962 corp: 11/282b lim: 40 exec/s: 0 rss: 75Mb L: 14/38 MS: 1 EraseBytes- 00:08:19.566 [2024-11-28 12:38:49.527123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.527153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.566 [2024-11-28 12:38:49.527201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.527217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.566 [2024-11-28 12:38:49.527247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.527263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.566 [2024-11-28 12:38:49.527291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:19.566 [2024-11-28 12:38:49.527306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.566 #32 NEW cov: 12505 ft: 15034 corp: 12/321b lim: 40 exec/s: 32 rss: 75Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:19.566 [2024-11-28 12:38:49.577152] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.577182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.566 [2024-11-28 12:38:49.577230] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff535353 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.577251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.566 [2024-11-28 12:38:49.577281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.577296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.566 [2024-11-28 12:38:49.577325] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.577340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.566 #33 NEW cov: 12505 ft: 15059 corp: 13/360b lim: 40 exec/s: 33 rss: 75Mb L: 39/39 MS: 1 InsertByte- 00:08:19.566 [2024-11-28 12:38:49.637030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a8a1300 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.637063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.566 [2024-11-28 12:38:49.637112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:5353ad00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.566 [2024-11-28 12:38:49.637129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.825 #34 NEW cov: 12505 ft: 15097 corp: 14/379b lim: 40 exec/s: 34 rss: 75Mb L: 19/39 MS: 1 ChangeBinInt- 00:08:19.825 [2024-11-28 12:38:49.726996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a8a0aff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.825 [2024-11-28 12:38:49.727025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.825 #40 NEW cov: 12505 ft: 15141 corp: 15/393b lim: 40 exec/s: 40 rss: 75Mb L: 14/39 MS: 1 ChangeBit- 00:08:19.825 [2024-11-28 12:38:49.817270] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:7a8a1300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.825 [2024-11-28 12:38:49.817300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.825 [2024-11-28 12:38:49.817349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.825 [2024-11-28 12:38:49.817364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.825 [2024-11-28 12:38:49.817394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:5353ad00 cdw11:0000ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.825 [2024-11-28 12:38:49.817410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.825 [2024-11-28 12:38:49.817439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.826 [2024-11-28 12:38:49.817454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.826 #41 NEW cov: 12505 ft: 15181 corp: 16/428b lim: 40 exec/s: 41 rss: 75Mb L: 35/39 MS: 1 CrossOver- 00:08:19.826 [2024-11-28 12:38:49.867158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a8a0aff cdw11:ffffefff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.826 [2024-11-28 12:38:49.867188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.826 [2024-11-28 12:38:49.867236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:5353ad00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.826 [2024-11-28 12:38:49.867255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.826 #42 NEW cov: 12505 ft: 15191 corp: 17/447b lim: 40 exec/s: 42 rss: 75Mb L: 19/39 MS: 1 ChangeBit- 00:08:19.826 [2024-11-28 12:38:49.917174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.826 [2024-11-28 12:38:49.917205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.826 [2024-11-28 12:38:49.917253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.826 [2024-11-28 12:38:49.917270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.085 #43 NEW cov: 12505 ft: 15202 corp: 18/467b lim: 40 exec/s: 43 rss: 75Mb L: 20/39 MS: 1 EraseBytes- 00:08:20.085 [2024-11-28 12:38:50.007400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.085 [2024-11-28 12:38:50.007432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.085 [2024-11-28 12:38:50.007467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff535353 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:20.085 [2024-11-28 12:38:50.007491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.085 [2024-11-28 12:38:50.007522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.085 [2024-11-28 12:38:50.007538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.085 [2024-11-28 12:38:50.007568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.085 [2024-11-28 12:38:50.007583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.085 #44 NEW cov: 12505 ft: 15260 corp: 19/505b lim: 40 exec/s: 44 rss: 75Mb L: 38/39 MS: 1 ChangeBit- 00:08:20.085 [2024-11-28 12:38:50.067463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.085 [2024-11-28 12:38:50.067507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.085 [2024-11-28 12:38:50.067544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.085 [2024-11-28 12:38:50.067561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.085 [2024-11-28 12:38:50.067591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.085 [2024-11-28 12:38:50.067607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.085 [2024-11-28 12:38:50.067637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ff2300ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.086 [2024-11-28 12:38:50.067652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.086 #45 NEW cov: 12505 ft: 15263 corp: 20/543b lim: 40 exec/s: 45 rss: 75Mb L: 38/39 MS: 1 CopyPart- 00:08:20.086 [2024-11-28 12:38:50.157445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a8a0aff cdw11:ffffefff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.086 [2024-11-28 12:38:50.157487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.086 [2024-11-28 12:38:50.157538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:5353ad00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.086 [2024-11-28 12:38:50.157554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.086 [2024-11-28 12:38:50.157585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:0000ff0e cdw11:0e0e0e0e SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:20.086 [2024-11-28 12:38:50.157601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.086 [2024-11-28 12:38:50.157631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:0e0e0e0e cdw11:0e0e0e0e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.086 [2024-11-28 12:38:50.157647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.344 #46 NEW cov: 12505 ft: 15303 corp: 21/575b lim: 40 exec/s: 46 rss: 75Mb L: 32/39 MS: 1 InsertRepeatedBytes- 00:08:20.344 [2024-11-28 12:38:50.247227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.344 [2024-11-28 12:38:50.247259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.344 #47 NEW cov: 12505 ft: 15367 corp: 22/588b lim: 40 exec/s: 47 rss: 75Mb L: 13/39 MS: 1 CrossOver- 00:08:20.344 [2024-11-28 12:38:50.307241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.344 [2024-11-28 12:38:50.307271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.344 #48 NEW cov: 12505 ft: 15390 corp: 23/601b lim: 40 exec/s: 48 rss: 75Mb L: 13/39 MS: 1 ChangeByte- 00:08:20.344 [2024-11-28 12:38:50.407504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.344 [2024-11-28 12:38:50.407537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.344 [2024-11-28 12:38:50.407571] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.344 [2024-11-28 12:38:50.407588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.344 [2024-11-28 12:38:50.407618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.344 [2024-11-28 12:38:50.407634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.344 [2024-11-28 12:38:50.407665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ff2300ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.344 [2024-11-28 12:38:50.407681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.604 #49 NEW cov: 12512 ft: 15419 corp: 24/639b lim: 40 exec/s: 49 rss: 75Mb L: 38/39 MS: 1 CopyPart- 00:08:20.604 [2024-11-28 12:38:50.497301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:7a8a0aff cdw11:fffffffb SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.604 [2024-11-28 12:38:50.497336] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.604 #50 NEW cov: 12512 ft: 15473 corp: 25/653b lim: 40 exec/s: 25 rss: 76Mb L: 14/39 MS: 1 ChangeBit- 00:08:20.604 #50 DONE cov: 12512 ft: 15473 corp: 25/653b lim: 40 exec/s: 25 rss: 76Mb 00:08:20.604 Done 50 runs in 2 second(s) 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 13 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4413 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:20.604 12:38:50 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:20.604 [2024-11-28 12:38:50.724243] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:20.604 [2024-11-28 12:38:50.724318] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid610738 ] 00:08:21.173 [2024-11-28 12:38:51.043287] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:21.173 [2024-11-28 12:38:51.089519] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.173 [2024-11-28 12:38:51.105190] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.173 [2024-11-28 12:38:51.157701] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.173 [2024-11-28 12:38:51.173812] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:21.173 INFO: Running with entropic power schedule (0xFF, 100). 00:08:21.173 INFO: Seed: 1368152905 00:08:21.173 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:21.173 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:21.173 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:21.173 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.173 #2 INITED exec/s: 0 rss: 66Mb 00:08:21.173 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:21.173 This may also happen if the target rejected all inputs we tried so far 00:08:21.173 [2024-11-28 12:38:51.229551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.173 [2024-11-28 12:38:51.229579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.173 [2024-11-28 12:38:51.229635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.173 [2024-11-28 12:38:51.229649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.173 [2024-11-28 12:38:51.229705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.173 [2024-11-28 12:38:51.229719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.173 [2024-11-28 12:38:51.229776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.173 [2024-11-28 12:38:51.229789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.432 NEW_FUNC[1/716]: 0x471278 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:21.432 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.432 #3 NEW cov: 12272 ft: 12271 corp: 2/40b lim: 40 exec/s: 0 rss: 73Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:08:21.692 [2024-11-28 12:38:51.559273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.559321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.692 #8 NEW cov: 12385 ft: 13487 corp: 3/55b lim: 
40 exec/s: 0 rss: 74Mb L: 15/39 MS: 5 ChangeByte-CrossOver-CrossOver-ChangeBit-InsertRepeatedBytes- 00:08:21.692 [2024-11-28 12:38:51.599651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.599677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.599749] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.599763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.599815] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.599828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.599882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.599895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.599952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.599967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:21.692 #9 NEW cov: 12391 ft: 13782 corp: 4/95b lim: 40 exec/s: 0 rss: 74Mb L: 40/40 MS: 1 CopyPart- 00:08:21.692 [2024-11-28 12:38:51.659421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.659446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.659505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.659519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.659577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29293f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.659590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.692 #10 NEW cov: 12476 ft: 14291 corp: 5/125b lim: 40 exec/s: 0 rss: 74Mb L: 30/40 MS: 1 InsertRepeatedBytes- 00:08:21.692 [2024-11-28 12:38:51.719450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:21.692 [2024-11-28 12:38:51.719481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.719539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.719553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.719607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.719620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.692 #11 NEW cov: 12476 ft: 14365 corp: 6/151b lim: 40 exec/s: 0 rss: 74Mb L: 26/40 MS: 1 EraseBytes- 00:08:21.692 [2024-11-28 12:38:51.759370] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a410b3f cdw11:3f3f3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.759396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.759467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.759487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.692 #14 NEW cov: 12476 ft: 14630 corp: 7/168b lim: 40 exec/s: 0 rss: 74Mb L: 17/40 MS: 3 InsertByte-ChangeByte-CrossOver- 00:08:21.692 [2024-11-28 12:38:51.799581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.799605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.799664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.799677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.799736] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.799749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.692 [2024-11-28 12:38:51.799804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.692 [2024-11-28 12:38:51.799817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.952 #15 NEW cov: 12476 ft: 14685 corp: 8/201b lim: 40 exec/s: 0 rss: 74Mb L: 33/40 MS: 1 CrossOver- 00:08:21.952 [2024-11-28 
12:38:51.859503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.859528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:51.859585] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:77f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.859598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:51.859652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.859666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.952 #16 NEW cov: 12476 ft: 14720 corp: 9/227b lim: 40 exec/s: 0 rss: 74Mb L: 26/40 MS: 1 ChangeBit- 00:08:21.952 [2024-11-28 12:38:51.919660] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.919685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:51.919740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f29a9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.919754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:51.919811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.919824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:51.919882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.919895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.952 #17 NEW cov: 12476 ft: 14749 corp: 10/260b lim: 40 exec/s: 0 rss: 74Mb L: 33/40 MS: 1 ChangeBit- 00:08:21.952 [2024-11-28 12:38:51.979543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.979567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:51.979639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.979653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:51.979714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:51.979727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.952 #18 NEW cov: 12476 ft: 14777 corp: 11/288b lim: 40 exec/s: 0 rss: 74Mb L: 28/40 MS: 1 EraseBytes- 00:08:21.952 [2024-11-28 12:38:52.019273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0bffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:52.019298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.952 #19 NEW cov: 12476 ft: 14906 corp: 12/303b lim: 40 exec/s: 0 rss: 74Mb L: 15/40 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:21.952 [2024-11-28 12:38:52.059596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:52.059622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:52.059680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:52.059694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.952 [2024-11-28 12:38:52.059750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:1629293f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:21.952 [2024-11-28 12:38:52.059763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.212 #20 NEW cov: 12476 ft: 14917 corp: 13/334b lim: 40 exec/s: 0 rss: 74Mb L: 31/40 MS: 1 InsertByte- 00:08:22.212 [2024-11-28 12:38:52.099773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.099799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.099870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f29a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.099884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.099943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.099957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.100013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 
cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.100027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.212 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:22.212 #21 NEW cov: 12499 ft: 15006 corp: 14/367b lim: 40 exec/s: 0 rss: 74Mb L: 33/40 MS: 1 ChangeBit- 00:08:22.212 [2024-11-28 12:38:52.159806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.159832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.159908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.159922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.159977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.159991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.160047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.160060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.212 #22 NEW cov: 12499 ft: 15122 corp: 15/400b lim: 40 exec/s: 0 rss: 74Mb L: 33/40 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:22.212 [2024-11-28 12:38:52.199814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.199839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.199913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.199927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.199983] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f727 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.199996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.200051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:000000f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.200064] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.212 #23 NEW cov: 12499 ft: 15163 corp: 16/439b lim: 40 exec/s: 23 rss: 74Mb L: 39/40 MS: 1 ChangeBinInt- 00:08:22.212 [2024-11-28 12:38:52.239820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:363f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.239846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.239917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.239932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.239990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.240003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.240059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.240072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.212 #24 NEW cov: 12499 ft: 15209 corp: 17/472b lim: 40 exec/s: 24 rss: 74Mb L: 33/40 MS: 1 ChangeByte- 00:08:22.212 [2024-11-28 12:38:52.279861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:363f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.279887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.279944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.279958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.280012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.280025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.212 [2024-11-28 12:38:52.280081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.212 [2024-11-28 12:38:52.280094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.212 #25 NEW cov: 12499 ft: 15259 corp: 18/505b lim: 40 exec/s: 25 rss: 74Mb L: 33/40 MS: 1 ShuffleBytes- 00:08:22.472 [2024-11-28 12:38:52.339799] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f13 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.339825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.339882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.339896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.339953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29293f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.339966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.472 #26 NEW cov: 12499 ft: 15286 corp: 19/535b lim: 40 exec/s: 26 rss: 74Mb L: 30/40 MS: 1 ChangeByte- 00:08:22.472 [2024-11-28 12:38:52.379737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.379761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.379819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.379832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.379890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.379903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.472 #27 NEW cov: 12499 ft: 15288 corp: 20/563b lim: 40 exec/s: 27 rss: 74Mb L: 28/40 MS: 1 ShuffleBytes- 00:08:22.472 [2024-11-28 12:38:52.439755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.439783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.439841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:77f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.439854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.439926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f7f7f727 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.439939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:08:22.472 #28 NEW cov: 12499 ft: 15309 corp: 21/589b lim: 40 exec/s: 28 rss: 74Mb L: 26/40 MS: 1 ChangeByte- 00:08:22.472 [2024-11-28 12:38:52.499951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:363f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.499976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.500047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.500061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.500116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.500129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.500184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:292d2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.500197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.472 #29 NEW cov: 12499 ft: 15317 corp: 22/623b lim: 40 exec/s: 29 rss: 75Mb L: 34/40 MS: 1 InsertByte- 00:08:22.472 [2024-11-28 12:38:52.539659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.539684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.539742] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.539755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.472 #30 NEW cov: 12499 ft: 15322 corp: 23/640b lim: 40 exec/s: 30 rss: 75Mb L: 17/40 MS: 1 EraseBytes- 00:08:22.472 [2024-11-28 12:38:52.579800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:29293f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.579824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.579881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:293f2929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.579894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.472 [2024-11-28 12:38:52.579950] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:1629293f SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:22.472 [2024-11-28 12:38:52.579967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.731 #31 NEW cov: 12499 ft: 15340 corp: 24/671b lim: 40 exec/s: 31 rss: 75Mb L: 31/40 MS: 1 ShuffleBytes- 00:08:22.731 [2024-11-28 12:38:52.639878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.639902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.731 [2024-11-28 12:38:52.639960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.639974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.731 [2024-11-28 12:38:52.640030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.640043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.731 #32 NEW cov: 12499 ft: 15361 corp: 25/699b lim: 40 exec/s: 32 rss: 75Mb L: 28/40 MS: 1 ShuffleBytes- 00:08:22.731 [2024-11-28 12:38:52.699873] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:363f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.699897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.731 [2024-11-28 12:38:52.699953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.699966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.731 [2024-11-28 12:38:52.700039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.700053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.731 #33 NEW cov: 12499 ft: 15398 corp: 26/725b lim: 40 exec/s: 33 rss: 75Mb L: 26/40 MS: 1 EraseBytes- 00:08:22.731 [2024-11-28 12:38:52.760020] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.760044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.731 [2024-11-28 12:38:52.760102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.760116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.731 
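[Editor's note] The DIRECTIVE RECEIVE (1a) records above are the fuzzer replaying mutated admin commands against the NVMe-oF target, which answers each with INVALID OPCODE (00/01). For orientation only — this is a hedged sketch against SPDK's public raw-admin API, not the harness's actual code; the function names are hypothetical and `ctrlr` is assumed to be an already-attached controller — one such command could be issued like this:

    #include "spdk/stdinc.h"
    #include "spdk/nvme.h"

    static void
    admin_cpl_cb(void *ctx, const struct spdk_nvme_cpl *cpl)
    {
            /* sct 0x0 / sc 0x1 is the INVALID OPCODE (00/01) status
             * echoed throughout the records above. */
            printf("completion: sct=%u sc=%u\n", cpl->status.sct, cpl->status.sc);
    }

    static int
    send_fuzzed_directive_receive(struct spdk_nvme_ctrlr *ctrlr,
                                  uint32_t cdw10, uint32_t cdw11)
    {
            struct spdk_nvme_cmd cmd;
            int rc;

            memset(&cmd, 0, sizeof(cmd));
            cmd.opc = SPDK_NVME_OPC_DIRECTIVE_RECEIVE;  /* opcode 0x1a, as logged  */
            cmd.nsid = 0;                               /* nsid:0 in the records   */
            cmd.cdw10 = cdw10;                          /* mutated fuzz dwords,    */
            cmd.cdw11 = cdw11;                          /* e.g. 0x29292929         */

            rc = spdk_nvme_ctrlr_cmd_admin_raw(ctrlr, &cmd, NULL, 0,
                                               admin_cpl_cb, NULL);
            if (rc != 0) {
                    return rc;
            }
            /* Poll the admin queue until the completion above arrives. */
            while (spdk_nvme_ctrlr_process_admin_completions(ctrlr) == 0) {
            }
            return 0;
    }

The harness drives a stream of such commands per iteration, which is where the exec/s figures in the status lines come from.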
[2024-11-28 12:38:52.760171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f7f7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.731 [2024-11-28 12:38:52.760184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.731 [2024-11-28 12:38:52.760239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:f7f70000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.732 [2024-11-28 12:38:52.760252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.732 #34 NEW cov: 12499 ft: 15404 corp: 27/764b lim: 40 exec/s: 34 rss: 75Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:08:22.732 [2024-11-28 12:38:52.799598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.732 [2024-11-28 12:38:52.799621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.732 #35 NEW cov: 12499 ft: 15423 corp: 28/779b lim: 40 exec/s: 35 rss: 75Mb L: 15/40 MS: 1 CMP- DE: "\005\000"- 00:08:22.732 [2024-11-28 12:38:52.840043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.732 [2024-11-28 12:38:52.840067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.732 [2024-11-28 12:38:52.840124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:3f3f2929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.732 [2024-11-28 12:38:52.840138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.732 [2024-11-28 12:38:52.840195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.732 [2024-11-28 12:38:52.840209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.732 [2024-11-28 12:38:52.840264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:2909ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.732 [2024-11-28 12:38:52.840277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.991 #36 NEW cov: 12499 ft: 15444 corp: 29/813b lim: 40 exec/s: 36 rss: 75Mb L: 34/40 MS: 1 InsertByte- 00:08:22.991 [2024-11-28 12:38:52.899936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f7f3f29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:52.899961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.991 [2024-11-28 12:38:52.900017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:52.900030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.991 [2024-11-28 12:38:52.900088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:1629293f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:52.900101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.991 #37 NEW cov: 12499 ft: 15482 corp: 30/844b lim: 40 exec/s: 37 rss: 75Mb L: 31/40 MS: 1 ChangeBit- 00:08:22.991 [2024-11-28 12:38:52.940056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:363f3f3f cdw11:3f0b3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:52.940079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.991 [2024-11-28 12:38:52.940137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f20 cdw11:20203f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:52.940151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.991 [2024-11-28 12:38:52.940207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:3f292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:52.940223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.991 [2024-11-28 12:38:52.940279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:52.940292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:22.991 #38 NEW cov: 12499 ft: 15498 corp: 31/880b lim: 40 exec/s: 38 rss: 75Mb L: 36/40 MS: 1 InsertRepeatedBytes- 00:08:22.991 [2024-11-28 12:38:52.999724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:52.999748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.991 #39 NEW cov: 12499 ft: 15556 corp: 32/895b lim: 40 exec/s: 39 rss: 75Mb L: 15/40 MS: 1 ChangeByte- 00:08:22.991 [2024-11-28 12:38:53.060039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0af7f7f7 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:53.060064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.991 [2024-11-28 12:38:53.060121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:f7f7f7f7 cdw11:77f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:53.060135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
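[Editor's note] A hedged decode of the status line just above, going by standard libFuzzer output conventions: in "#38 NEW cov: 12499 ft: 15498 corp: 31/880b lim: 40 exec/s: 38 rss: 75Mb L: 36/40 MS: 1 InsertRepeatedBytes-", #38 is the total input count when the event fired; NEW means the input was kept in the corpus; cov and ft are covered edges and distinct coverage features; corp: 31/880b is the corpus size (31 inputs, 880 bytes total); lim: 40 is the current input-length cap; exec/s is the execution rate and rss the resident memory; L: 36/40 is the new input's length against the largest input in the corpus; and MS: 1 InsertRepeatedBytes- is the one-step mutation sequence that produced it. The PersAutoDict steps elsewhere in this run replay persistent auto-dictionary entries (the DE: values), which is what feeds the "Recommended dictionary" summary printed when each run finishes.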
00:08:22.991 [2024-11-28 12:38:53.060191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:f7f7f727 cdw11:f7f7f7f7 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:22.991 [2024-11-28 12:38:53.060205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:22.991 #40 NEW cov: 12499 ft: 15578 corp: 33/921b lim: 40 exec/s: 40 rss: 75Mb L: 26/40 MS: 1 ChangeByte- 00:08:23.251 [2024-11-28 12:38:53.119968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:3f292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.251 [2024-11-28 12:38:53.119995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.251 [2024-11-28 12:38:53.120054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.251 [2024-11-28 12:38:53.120067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.251 #41 NEW cov: 12499 ft: 15628 corp: 34/941b lim: 40 exec/s: 41 rss: 75Mb L: 20/40 MS: 1 EraseBytes- 00:08:23.251 [2024-11-28 12:38:53.160123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:363f3ff7 cdw11:3f3f0b3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.251 [2024-11-28 12:38:53.160148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.251 [2024-11-28 12:38:53.160209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:3f3f3f3f cdw11:2020203f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.251 [2024-11-28 12:38:53.160223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.251 [2024-11-28 12:38:53.160278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:3f3f2929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.251 [2024-11-28 12:38:53.160291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.251 [2024-11-28 12:38:53.160352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:29292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.251 [2024-11-28 12:38:53.160365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:23.251 #42 NEW cov: 12499 ft: 15635 corp: 35/978b lim: 40 exec/s: 42 rss: 75Mb L: 37/40 MS: 1 InsertByte- 00:08:23.251 [2024-11-28 12:38:53.220052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0b3f3f3f cdw11:3f3f3f29 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.251 [2024-11-28 12:38:53.220078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.251 [2024-11-28 12:38:53.220137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:13292929 cdw11:29292929 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:23.251 [2024-11-28 12:38:53.220151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.251 [2024-11-28 12:38:53.220209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:29292929 cdw11:29293f3f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:23.252 [2024-11-28 12:38:53.220223] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.252 #43 NEW cov: 12499 ft: 15641 corp: 36/1008b lim: 40 exec/s: 21 rss: 75Mb L: 30/40 MS: 1 ShuffleBytes- 00:08:23.252 #43 DONE cov: 12499 ft: 15641 corp: 36/1008b lim: 40 exec/s: 21 rss: 75Mb 00:08:23.252 ###### Recommended dictionary. ###### 00:08:23.252 "\377\377\377\377\377\377\377\377" # Uses: 1 00:08:23.252 "\005\000" # Uses: 0 00:08:23.252 ###### End of recommended dictionary. ###### 00:08:23.252 Done 43 runs in 2 second(s) 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 14 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4414 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:23.252 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.514 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:23.514 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:23.514 12:38:53 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 
-Z 14 00:08:23.514 [2024-11-28 12:38:53.407292] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:23.514 [2024-11-28 12:38:53.407367] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid611092 ] 00:08:23.773 [2024-11-28 12:38:53.725586] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:23.773 [2024-11-28 12:38:53.772216] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.773 [2024-11-28 12:38:53.792717] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.773 [2024-11-28 12:38:53.845222] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.773 [2024-11-28 12:38:53.861328] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:23.773 INFO: Running with entropic power schedule (0xFF, 100). 00:08:23.773 INFO: Seed: 4055152978 00:08:23.773 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:23.773 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:23.773 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:23.773 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.773 #2 INITED exec/s: 0 rss: 66Mb 00:08:23.773 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.773 This may also happen if the target rejected all inputs we tried so far 00:08:24.033 [2024-11-28 12:38:53.916706] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.033 [2024-11-28 12:38:53.916735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.293 NEW_FUNC[1/717]: 0x472e48 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:24.293 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:24.293 #12 NEW cov: 12265 ft: 12233 corp: 2/8b lim: 35 exec/s: 0 rss: 73Mb L: 7/7 MS: 5 InsertByte-InsertByte-ChangeByte-CopyPart-InsertByte- 00:08:24.293 [2024-11-28 12:38:54.236890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.293 [2024-11-28 12:38:54.236925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.293 #13 NEW cov: 12379 ft: 12840 corp: 3/15b lim: 35 exec/s: 0 rss: 74Mb L: 7/7 MS: 1 ChangeBinInt- 00:08:24.293 [2024-11-28 12:38:54.297453] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.293 [2024-11-28 12:38:54.297487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.293 [2024-11-28 12:38:54.297564] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES 
RESERVED cid:5 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.297581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.297642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.297657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.297717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.297736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.297798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.297812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.294 #22 NEW cov: 12385 ft: 13902 corp: 4/50b lim: 35 exec/s: 0 rss: 74Mb L: 35/35 MS: 4 InsertByte-InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:24.294 [2024-11-28 12:38:54.337455] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.337489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.337551] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.337567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.337627] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.337641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.337699] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.337714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.337772] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.337788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.294 #23 NEW cov: 12477 ft: 14118 corp: 5/85b lim: 35 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:24.294 [2024-11-28 12:38:54.397455] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 
12:38:54.397488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.397566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.397583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.397647] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.397660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.397721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.397737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.294 [2024-11-28 12:38:54.397797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.294 [2024-11-28 12:38:54.397813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.553 #24 NEW cov: 12477 ft: 14252 corp: 6/120b lim: 35 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:24.553 [2024-11-28 12:38:54.457488] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.457516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.457578] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.457595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.457657] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.457670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.457728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.457743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.457800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.457816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.554 #25 NEW cov: 12477 ft: 14478 corp: 7/155b lim: 35 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 CrossOver- 
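[Editor's note] This run (14) exercises SET FEATURES: the NEW_FUNC line above resolves the handler to fuzz_admin_set_features_command at llvm_nvme_fuzz.c:392. A hedged sketch of how a logged cdw10 value like 0x800000f4 decomposes — the helper name is hypothetical and this only mirrors the handler's observable effect, not its source:

    #include "spdk/stdinc.h"
    #include "spdk/nvme.h"

    static void
    build_fuzzed_set_features(struct spdk_nvme_cmd *cmd, uint32_t fuzzed_cdw10)
    {
            memset(cmd, 0, sizeof(*cmd));
            cmd->opc = SPDK_NVME_OPC_SET_FEATURES;  /* opcode 0x09 */
            /* Bits 7:0 of cdw10 select the feature ID; bit 31 (SV) asks the
             * controller to persist the value. The logged 0x800000f4 is SV
             * plus reserved FID 0xf4, which the target rejects with FEATURE
             * ID NOT SAVEABLE (01/0d). Reserved IDs without SV come back as
             * INVALID FIELD (00/02) instead, as other records here show. */
            cmd->cdw10 = fuzzed_cdw10;
    }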
00:08:24.554 [2024-11-28 12:38:54.496963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.496990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.497053] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.497069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.554 #26 NEW cov: 12477 ft: 14791 corp: 8/169b lim: 35 exec/s: 0 rss: 74Mb L: 14/35 MS: 1 CopyPart- 00:08:24.554 [2024-11-28 12:38:54.537527] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.537557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.537622] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.537639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.537702] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.537716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.537776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.537792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.554 [2024-11-28 12:38:54.537851] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.537867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.554 #27 NEW cov: 12477 ft: 14809 corp: 9/204b lim: 35 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:24.554 [2024-11-28 12:38:54.576934] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.576964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.554 #28 NEW cov: 12477 ft: 14854 corp: 10/215b lim: 35 exec/s: 0 rss: 74Mb L: 11/35 MS: 1 CMP- DE: "\000\000\000\007"- 00:08:24.554 [2024-11-28 12:38:54.636908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.554 [2024-11-28 12:38:54.636937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.813 #29 NEW 
cov: 12477 ft: 14880 corp: 11/226b lim: 35 exec/s: 0 rss: 74Mb L: 11/35 MS: 1 ShuffleBytes- 00:08:24.813 [2024-11-28 12:38:54.697636] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.813 [2024-11-28 12:38:54.697665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.813 [2024-11-28 12:38:54.697745] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.813 [2024-11-28 12:38:54.697763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.813 [2024-11-28 12:38:54.697825] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.813 [2024-11-28 12:38:54.697839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.813 [2024-11-28 12:38:54.697899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.813 [2024-11-28 12:38:54.697915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.813 [2024-11-28 12:38:54.697975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.813 [2024-11-28 12:38:54.697992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.813 #30 NEW cov: 12477 ft: 14937 corp: 12/261b lim: 35 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 CrossOver- 00:08:24.813 [2024-11-28 12:38:54.757005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.813 [2024-11-28 12:38:54.757033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.813 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:24.813 #31 NEW cov: 12500 ft: 15009 corp: 13/272b lim: 35 exec/s: 0 rss: 74Mb L: 11/35 MS: 1 CMP- DE: "\377\377\377\377\377\377\003\000"- 00:08:24.814 [2024-11-28 12:38:54.817701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.817727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.814 [2024-11-28 12:38:54.817805] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.817822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.814 [2024-11-28 12:38:54.817881] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000029 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.817898] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:24.814 [2024-11-28 12:38:54.817959] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.817974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:24.814 [2024-11-28 12:38:54.818034] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.818050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:24.814 #32 NEW cov: 12500 ft: 15085 corp: 14/307b lim: 35 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:24.814 [2024-11-28 12:38:54.857186] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.857214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.814 [2024-11-28 12:38:54.857288] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.857304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.814 #33 NEW cov: 12500 ft: 15089 corp: 15/321b lim: 35 exec/s: 0 rss: 74Mb L: 14/35 MS: 1 InsertRepeatedBytes- 00:08:24.814 [2024-11-28 12:38:54.897040] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.897065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.814 #34 NEW cov: 12500 ft: 15181 corp: 16/332b lim: 35 exec/s: 34 rss: 74Mb L: 11/35 MS: 1 PersAutoDict- DE: "\000\000\000\007"- 00:08:24.814 [2024-11-28 12:38:54.937289] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.937317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.814 [2024-11-28 12:38:54.937379] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.814 [2024-11-28 12:38:54.937396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.073 #35 NEW cov: 12500 ft: 15230 corp: 17/346b lim: 35 exec/s: 35 rss: 74Mb L: 14/35 MS: 1 CopyPart- 00:08:25.073 [2024-11-28 12:38:54.997824] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:54.997851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.073 [2024-11-28 12:38:54.997929] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 
cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:54.997946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.073 [2024-11-28 12:38:54.998004] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:8000007b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:54.998021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.073 [2024-11-28 12:38:54.998081] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:54.998100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.073 [2024-11-28 12:38:54.998161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:8 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:54.998178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.073 #36 NEW cov: 12500 ft: 15280 corp: 18/381b lim: 35 exec/s: 36 rss: 74Mb L: 35/35 MS: 1 ChangeByte- 00:08:25.073 [2024-11-28 12:38:55.037463] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:55.037495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.073 [2024-11-28 12:38:55.037573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:55.037589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.073 [2024-11-28 12:38:55.037648] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:55.037662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.073 #37 NEW cov: 12500 ft: 15481 corp: 19/403b lim: 35 exec/s: 37 rss: 74Mb L: 22/35 MS: 1 InsertRepeatedBytes- 00:08:25.073 [2024-11-28 12:38:55.077138] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:55.077165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.073 #38 NEW cov: 12500 ft: 15497 corp: 20/410b lim: 35 exec/s: 38 rss: 74Mb L: 7/35 MS: 1 ChangeByte- 00:08:25.073 [2024-11-28 12:38:55.117345] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:55.117370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.073 [2024-11-28 12:38:55.117429] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES NUMBER OF QUEUES cid:5 cdw10:00000007 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:55.117445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: COMMAND SEQUENCE ERROR (00/0c) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.073 NEW_FUNC[1/1]: 0x491e48 in feat_number_of_queues /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:318 00:08:25.073 #39 NEW cov: 12534 ft: 15586 corp: 21/425b lim: 35 exec/s: 39 rss: 74Mb L: 15/35 MS: 1 PersAutoDict- DE: "\000\000\000\007"- 00:08:25.073 [2024-11-28 12:38:55.177376] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:55.177402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.073 [2024-11-28 12:38:55.177461] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.073 [2024-11-28 12:38:55.177481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.332 #40 NEW cov: 12534 ft: 15596 corp: 22/439b lim: 35 exec/s: 40 rss: 74Mb L: 14/35 MS: 1 ChangeBit- 00:08:25.332 [2024-11-28 12:38:55.217201] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.332 [2024-11-28 12:38:55.217229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.332 #41 NEW cov: 12534 ft: 15633 corp: 23/446b lim: 35 exec/s: 41 rss: 74Mb L: 7/35 MS: 1 ChangeBinInt- 00:08:25.332 [2024-11-28 12:38:55.257407] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.332 [2024-11-28 12:38:55.257435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.332 [2024-11-28 12:38:55.257514] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES LBA RANGE TYPE cid:5 cdw10:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.332 [2024-11-28 12:38:55.257529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.332 NEW_FUNC[1/1]: 0x48f5d8 in feat_lba_range_type /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:289 00:08:25.332 #42 NEW cov: 12545 ft: 15659 corp: 24/460b lim: 35 exec/s: 42 rss: 75Mb L: 14/35 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\003\000"- 00:08:25.332 [2024-11-28 12:38:55.317331] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.332 [2024-11-28 12:38:55.317359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.332 #43 NEW cov: 12545 ft: 15667 corp: 25/471b lim: 35 exec/s: 43 rss: 75Mb L: 11/35 MS: 1 ChangeBit- 00:08:25.332 [2024-11-28 12:38:55.357273] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.332 [2024-11-28 12:38:55.357301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE 
ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.332 #44 NEW cov: 12545 ft: 15681 corp: 26/482b lim: 35 exec/s: 44 rss: 75Mb L: 11/35 MS: 1 ShuffleBytes- 00:08:25.332 [2024-11-28 12:38:55.397449] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.332 [2024-11-28 12:38:55.397480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.332 [2024-11-28 12:38:55.397559] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.332 [2024-11-28 12:38:55.397577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.332 #45 NEW cov: 12545 ft: 15723 corp: 27/500b lim: 35 exec/s: 45 rss: 75Mb L: 18/35 MS: 1 PersAutoDict- DE: "\000\000\000\007"- 00:08:25.332 [2024-11-28 12:38:55.437307] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.332 [2024-11-28 12:38:55.437333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.590 #47 NEW cov: 12545 ft: 15762 corp: 28/507b lim: 35 exec/s: 47 rss: 75Mb L: 7/35 MS: 2 EraseBytes-CrossOver- 00:08:25.590 [2024-11-28 12:38:55.477846] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000019 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.477873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.590 [2024-11-28 12:38:55.477948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.477965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.590 [2024-11-28 12:38:55.478027] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.478044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.590 [2024-11-28 12:38:55.478106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000d6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.478122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.590 #48 NEW cov: 12545 ft: 15821 corp: 29/536b lim: 35 exec/s: 48 rss: 75Mb L: 29/35 MS: 1 EraseBytes- 00:08:25.590 [2024-11-28 12:38:55.517356] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.517383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.590 #49 NEW cov: 12545 ft: 15835 corp: 30/547b lim: 35 exec/s: 49 rss: 75Mb L: 11/35 MS: 1 ChangeBit- 00:08:25.590 [2024-11-28 12:38:55.577628] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.577658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.590 [2024-11-28 12:38:55.577714] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES LBA RANGE TYPE cid:5 cdw10:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.577728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.590 #50 NEW cov: 12545 ft: 15889 corp: 31/561b lim: 35 exec/s: 50 rss: 75Mb L: 14/35 MS: 1 ShuffleBytes- 00:08:25.590 [2024-11-28 12:38:55.637428] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:000000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.637452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.590 #52 NEW cov: 12545 ft: 15896 corp: 32/569b lim: 35 exec/s: 52 rss: 75Mb L: 8/35 MS: 2 EraseBytes-PersAutoDict- DE: "\000\000\000\007"- 00:08:25.590 [2024-11-28 12:38:55.697475] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.590 [2024-11-28 12:38:55.697503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 #53 NEW cov: 12545 ft: 15897 corp: 33/580b lim: 35 exec/s: 53 rss: 75Mb L: 11/35 MS: 1 ChangeBit- 00:08:25.849 [2024-11-28 12:38:55.757665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.849 [2024-11-28 12:38:55.757694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 [2024-11-28 12:38:55.757755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.849 [2024-11-28 12:38:55.757772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.849 #54 NEW cov: 12545 ft: 15913 corp: 34/594b lim: 35 exec/s: 54 rss: 75Mb L: 14/35 MS: 1 InsertRepeatedBytes- 00:08:25.849 [2024-11-28 12:38:55.797645] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.849 [2024-11-28 12:38:55.797674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 [2024-11-28 12:38:55.797735] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000f4 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.849 [2024-11-28 12:38:55.797752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.849 #55 NEW cov: 12545 ft: 15980 corp: 35/608b lim: 35 exec/s: 55 rss: 75Mb L: 14/35 MS: 1 ShuffleBytes- 00:08:25.849 [2024-11-28 12:38:55.857509] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000f4 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:25.849 [2024-11-28 12:38:55.857543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 #56 NEW cov: 12545 ft: 15996 corp: 36/619b lim: 35 exec/s: 28 rss: 76Mb L: 11/35 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\012"- 00:08:25.849 #56 DONE cov: 12545 ft: 15996 corp: 36/619b lim: 35 exec/s: 28 rss: 76Mb 00:08:25.849 ###### Recommended dictionary. ###### 00:08:25.849 "\000\000\000\007" # Uses: 4 00:08:25.849 "\377\377\377\377\377\377\003\000" # Uses: 1 00:08:25.849 "\000\000\000\000\000\000\000\012" # Uses: 0 00:08:25.849 ###### End of recommended dictionary. ###### 00:08:25.849 Done 56 runs in 2 second(s) 00:08:26.108 12:38:55 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:26.108 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:26.109 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 15 00:08:26.109 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4415 00:08:26.109 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:26.109 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:26.109 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:26.109 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:26.109 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:26.109 12:38:56 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:26.109 [2024-11-28 12:38:56.048153] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
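The xtrace above (run.sh@23 through run.sh@45) shows the pattern each start_llvm_fuzz iteration follows: derive the TCP listener port from the fuzzer index (printf %02d 15 yields port 4415), rewrite trsvcid in the shared fuzz_json.conf, register the two known LSAN leak suppressions, and launch llvm_nvme_fuzz against a per-fuzzer corpus directory. A minimal standalone sketch of that sequence, reconstructed from the trace rather than taken from the actual nvmf/run.sh source — SPDK_DIR and OUTPUT_DIR are placeholder variables:

    start_llvm_fuzz() {   # sketch of run.sh@23-45, reconstructed from the xtrace above
        local fuzzer_type=$1 timen=$2 core=$3
        local port="44$(printf %02d "$fuzzer_type")"   # 15 -> 4415, 16 -> 4416
        local corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_$(printf %02d "$fuzzer_type")"
        local nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
        local suppress_file=/var/tmp/suppress_nvmf_fuzz

        mkdir -p "$corpus_dir"
        # Point the shared JSON config at this instance's listener port (default 4420);
        # the redirect into $nvmf_cfg is implied by the -c argument in the trace.
        sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
            "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

        # Leak suppressions for allocations the target keeps alive on purpose.
        printf 'leak:%s\n' spdk_nvmf_qpair_disconnect nvmf_ctrlr_create > "$suppress_file"

        LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0" \
            "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
            -P "$OUTPUT_DIR/llvm/" \
            -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
            -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" -Z "$fuzzer_type"
    }

Invoked as start_llvm_fuzz 15 1 0x1, this reproduces the 'trtype:tcp ... trsvcid:4415' target and one-second timeout seen in the run that starts below.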
00:08:26.109 [2024-11-28 12:38:56.048226] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid611427 ] 00:08:26.368 [2024-11-28 12:38:56.365073] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:26.368 [2024-11-28 12:38:56.409830] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.368 [2024-11-28 12:38:56.427266] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.368 [2024-11-28 12:38:56.479892] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:26.627 [2024-11-28 12:38:56.496017] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:26.627 INFO: Running with entropic power schedule (0xFF, 100). 00:08:26.627 INFO: Seed: 2395203249 00:08:26.627 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:26.627 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:26.627 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:26.627 INFO: A corpus is not provided, starting from an empty corpus 00:08:26.627 #2 INITED exec/s: 0 rss: 66Mb 00:08:26.627 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:26.627 This may also happen if the target rejected all inputs we tried so far 00:08:26.627 [2024-11-28 12:38:56.551442] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.627 [2024-11-28 12:38:56.551479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.886 NEW_FUNC[1/711]: 0x474388 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:26.886 NEW_FUNC[2/711]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.886 #4 NEW cov: 12224 ft: 12214 corp: 2/10b lim: 35 exec/s: 0 rss: 73Mb L: 9/9 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:26.886 [2024-11-28 12:38:56.871748] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.886 [2024-11-28 12:38:56.871784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.886 [2024-11-28 12:38:56.871856] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.886 [2024-11-28 12:38:56.871870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.886 [2024-11-28 12:38:56.871935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.886 [2024-11-28 12:38:56.871948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.886 NEW_FUNC[1/5]: 0x1981008 in 
nvme_complete_register_operations /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:725 00:08:26.886 NEW_FUNC[2/5]: 0x19942a8 in nvme_ctrlr_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1216 00:08:26.886 #8 NEW cov: 12367 ft: 13172 corp: 3/32b lim: 35 exec/s: 0 rss: 73Mb L: 22/22 MS: 4 ChangeBit-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:26.886 [2024-11-28 12:38:56.911414] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.886 [2024-11-28 12:38:56.911440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.886 #9 NEW cov: 12373 ft: 13388 corp: 4/41b lim: 35 exec/s: 0 rss: 73Mb L: 9/22 MS: 1 CrossOver- 00:08:26.886 [2024-11-28 12:38:56.971703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.886 [2024-11-28 12:38:56.971729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.886 [2024-11-28 12:38:56.971808] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.886 [2024-11-28 12:38:56.971822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.886 [2024-11-28 12:38:56.971888] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.886 [2024-11-28 12:38:56.971902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.145 #10 NEW cov: 12458 ft: 13740 corp: 5/63b lim: 35 exec/s: 0 rss: 74Mb L: 22/22 MS: 1 ChangeBinInt- 00:08:27.145 [2024-11-28 12:38:57.031612] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.031638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.145 [2024-11-28 12:38:57.031704] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.031719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.145 #11 NEW cov: 12458 ft: 14046 corp: 6/81b lim: 35 exec/s: 0 rss: 74Mb L: 18/22 MS: 1 EraseBytes- 00:08:27.145 NEW_FUNC[1/1]: 0x494398 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:27.145 #12 NEW cov: 12472 ft: 14195 corp: 7/90b lim: 35 exec/s: 0 rss: 74Mb L: 9/22 MS: 1 ShuffleBytes- 00:08:27.145 [2024-11-28 12:38:57.131838] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.131866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.145 [2024-11-28 12:38:57.131931] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED 
cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.131945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.145 [2024-11-28 12:38:57.132009] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.132023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.145 #13 NEW cov: 12472 ft: 14294 corp: 8/112b lim: 35 exec/s: 0 rss: 74Mb L: 22/22 MS: 1 CMP- DE: "\000\000\000\000"- 00:08:27.145 [2024-11-28 12:38:57.171798] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.171825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.145 [2024-11-28 12:38:57.171889] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.171904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.145 [2024-11-28 12:38:57.171969] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.171983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.145 #14 NEW cov: 12472 ft: 14387 corp: 9/134b lim: 35 exec/s: 0 rss: 74Mb L: 22/22 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:27.145 [2024-11-28 12:38:57.231535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.145 [2024-11-28 12:38:57.231561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.145 #15 NEW cov: 12472 ft: 14460 corp: 10/143b lim: 35 exec/s: 0 rss: 74Mb L: 9/22 MS: 1 ShuffleBytes- 00:08:27.403 [2024-11-28 12:38:57.271554] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.403 [2024-11-28 12:38:57.271581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.403 #16 NEW cov: 12472 ft: 14517 corp: 11/152b lim: 35 exec/s: 0 rss: 74Mb L: 9/22 MS: 1 ChangeBinInt- 00:08:27.403 [2024-11-28 12:38:57.311597] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.403 [2024-11-28 12:38:57.311632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.403 #17 NEW cov: 12472 ft: 14586 corp: 12/165b lim: 35 exec/s: 0 rss: 74Mb L: 13/22 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:27.403 [2024-11-28 12:38:57.351613] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.403 [2024-11-28 12:38:57.351648] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.403 #18 NEW cov: 12472 ft: 14619 corp: 13/175b lim: 35 exec/s: 0 rss: 74Mb L: 10/22 MS: 1 InsertByte- 00:08:27.403 [2024-11-28 12:38:57.391909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.404 [2024-11-28 12:38:57.391934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.404 [2024-11-28 12:38:57.392014] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.404 [2024-11-28 12:38:57.392029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.404 [2024-11-28 12:38:57.392088] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.404 [2024-11-28 12:38:57.392102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.404 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:27.404 #19 NEW cov: 12495 ft: 14639 corp: 14/197b lim: 35 exec/s: 0 rss: 74Mb L: 22/22 MS: 1 ChangeByte- 00:08:27.404 [2024-11-28 12:38:57.452072] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.404 [2024-11-28 12:38:57.452097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.404 [2024-11-28 12:38:57.452178] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.404 [2024-11-28 12:38:57.452192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.404 [2024-11-28 12:38:57.452255] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.404 [2024-11-28 12:38:57.452269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.404 [2024-11-28 12:38:57.452334] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.404 [2024-11-28 12:38:57.452347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.404 #20 NEW cov: 12495 ft: 15123 corp: 15/227b lim: 35 exec/s: 0 rss: 74Mb L: 30/30 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:27.404 [2024-11-28 12:38:57.491655] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.404 [2024-11-28 12:38:57.491680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.404 #21 NEW cov: 12495 ft: 15132 corp: 16/240b lim: 35 exec/s: 0 rss: 74Mb L: 13/30 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:08:27.663 [2024-11-28 12:38:57.532121] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.663 [2024-11-28 12:38:57.532147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.663 [2024-11-28 12:38:57.532213] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.663 [2024-11-28 12:38:57.532228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.663 [2024-11-28 12:38:57.532292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.663 [2024-11-28 12:38:57.532306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.663 [2024-11-28 12:38:57.532366] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.663 [2024-11-28 12:38:57.532380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:27.663 #22 NEW cov: 12495 ft: 15195 corp: 17/270b lim: 35 exec/s: 22 rss: 74Mb L: 30/30 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:27.663 [2024-11-28 12:38:57.591683] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.663 [2024-11-28 12:38:57.591707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.663 #23 NEW cov: 12495 ft: 15211 corp: 18/283b lim: 35 exec/s: 23 rss: 74Mb L: 13/30 MS: 1 EraseBytes- 00:08:27.663 #24 NEW cov: 12495 ft: 15266 corp: 19/292b lim: 35 exec/s: 24 rss: 74Mb L: 9/30 MS: 1 ChangeBinInt- 00:08:27.663 #25 NEW cov: 12495 ft: 15277 corp: 20/301b lim: 35 exec/s: 25 rss: 74Mb L: 9/30 MS: 1 CrossOver- 00:08:27.663 [2024-11-28 12:38:57.771779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.663 [2024-11-28 12:38:57.771805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.922 #26 NEW cov: 12495 ft: 15284 corp: 21/310b lim: 35 exec/s: 26 rss: 74Mb L: 9/30 MS: 1 EraseBytes- 00:08:27.922 NEW_FUNC[1/1]: 0x48d828 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:27.922 #27 NEW cov: 12533 ft: 15396 corp: 22/323b lim: 35 exec/s: 27 rss: 74Mb L: 13/30 MS: 1 CMP- DE: "\001\000\000\020"- 00:08:27.922 [2024-11-28 12:38:57.871869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.922 [2024-11-28 12:38:57.871896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.923 #28 NEW cov: 12533 ft: 15401 corp: 23/336b lim: 35 exec/s: 28 rss: 74Mb L: 13/30 MS: 1 ShuffleBytes- 00:08:27.923 [2024-11-28 12:38:57.931921] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 
0x0 00:08:27.923 [2024-11-28 12:38:57.931945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.923 #29 NEW cov: 12533 ft: 15413 corp: 24/349b lim: 35 exec/s: 29 rss: 74Mb L: 13/30 MS: 1 ChangeByte- 00:08:27.923 [2024-11-28 12:38:57.992191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000025 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.923 [2024-11-28 12:38:57.992216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.923 [2024-11-28 12:38:57.992294] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.923 [2024-11-28 12:38:57.992310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.923 [2024-11-28 12:38:57.992370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.923 [2024-11-28 12:38:57.992387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:27.923 #30 NEW cov: 12533 ft: 15433 corp: 25/371b lim: 35 exec/s: 30 rss: 74Mb L: 22/30 MS: 1 ShuffleBytes- 00:08:27.923 [2024-11-28 12:38:58.032191] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.923 [2024-11-28 12:38:58.032216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.923 [2024-11-28 12:38:58.032296] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.923 [2024-11-28 12:38:58.032311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.923 [2024-11-28 12:38:58.032372] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:27.923 [2024-11-28 12:38:58.032387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.182 #31 NEW cov: 12533 ft: 15482 corp: 26/393b lim: 35 exec/s: 31 rss: 74Mb L: 22/30 MS: 1 ChangeBit- 00:08:28.182 [2024-11-28 12:38:58.091944] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.182 [2024-11-28 12:38:58.091970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.183 #32 NEW cov: 12533 ft: 15508 corp: 27/402b lim: 35 exec/s: 32 rss: 74Mb L: 9/30 MS: 1 EraseBytes- 00:08:28.183 [2024-11-28 12:38:58.151994] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.183 [2024-11-28 12:38:58.152019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.183 #33 NEW cov: 12533 ft: 15525 corp: 28/411b lim: 35 exec/s: 33 rss: 74Mb L: 9/30 MS: 1 ShuffleBytes- 00:08:28.183 [2024-11-28 12:38:58.212325] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.183 [2024-11-28 12:38:58.212351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.183 [2024-11-28 12:38:58.212417] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.183 [2024-11-28 12:38:58.212432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.183 [2024-11-28 12:38:58.212501] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.183 [2024-11-28 12:38:58.212515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.183 #34 NEW cov: 12533 ft: 15587 corp: 29/436b lim: 35 exec/s: 34 rss: 74Mb L: 25/30 MS: 1 CopyPart- 00:08:28.183 [2024-11-28 12:38:58.252299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.183 [2024-11-28 12:38:58.252323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.183 [2024-11-28 12:38:58.252388] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000003f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.183 [2024-11-28 12:38:58.252401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.183 [2024-11-28 12:38:58.252464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.183 [2024-11-28 12:38:58.252485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.183 #35 NEW cov: 12533 ft: 15622 corp: 30/461b lim: 35 exec/s: 35 rss: 74Mb L: 25/30 MS: 1 ChangeByte- 00:08:28.443 [2024-11-28 12:38:58.312094] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.312120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.443 #36 NEW cov: 12533 ft: 15631 corp: 31/468b lim: 35 exec/s: 36 rss: 74Mb L: 7/30 MS: 1 EraseBytes- 00:08:28.443 [2024-11-28 12:38:58.352185] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.352209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.443 [2024-11-28 12:38:58.352287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.352302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 #37 NEW cov: 12533 ft: 15654 corp: 32/486b lim: 35 exec/s: 37 rss: 74Mb L: 18/30 MS: 1 ShuffleBytes- 00:08:28.443 [2024-11-28 
12:38:58.392061] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.392085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.443 #38 NEW cov: 12533 ft: 15669 corp: 33/499b lim: 35 exec/s: 38 rss: 75Mb L: 13/30 MS: 1 ChangeBinInt- 00:08:28.443 [2024-11-28 12:38:58.432353] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.432379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.443 [2024-11-28 12:38:58.432447] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.432461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 [2024-11-28 12:38:58.432528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.432542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.443 #39 NEW cov: 12533 ft: 15741 corp: 34/521b lim: 35 exec/s: 39 rss: 75Mb L: 22/30 MS: 1 ChangeBit- 00:08:28.443 [2024-11-28 12:38:58.472122] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.472148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.443 #40 NEW cov: 12533 ft: 15784 corp: 35/534b lim: 35 exec/s: 40 rss: 75Mb L: 13/30 MS: 1 PersAutoDict- DE: "\001\000\000\020"- 00:08:28.443 [2024-11-28 12:38:58.512607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000725 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.512635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.443 [2024-11-28 12:38:58.512719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.512736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.443 [2024-11-28 12:38:58.512804] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.512824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.443 [2024-11-28 12:38:58.512893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:28.443 [2024-11-28 12:38:58.512909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.443 #46 NEW cov: 12533 ft: 15786 corp: 36/564b lim: 35 exec/s: 23 rss: 75Mb L: 30/30 
MS: 1 CopyPart- 00:08:28.443 #46 DONE cov: 12533 ft: 15786 corp: 36/564b lim: 35 exec/s: 23 rss: 75Mb 00:08:28.443 ###### Recommended dictionary. ###### 00:08:28.443 "\000\000\000\000" # Uses: 3 00:08:28.443 "\377\377\377\377\377\377\377\377" # Uses: 1 00:08:28.443 "\001\000\000\020" # Uses: 1 00:08:28.443 ###### End of recommended dictionary. ###### 00:08:28.443 Done 46 runs in 2 second(s) 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 16 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4416 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:28.704 12:38:58 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:28.704 [2024-11-28 12:38:58.707510] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:28.704 [2024-11-28 12:38:58.707601] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid611781 ] 00:08:28.963 [2024-11-28 12:38:59.029112] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. 
Enabled only for validation. 00:08:28.963 [2024-11-28 12:38:59.074416] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.223 [2024-11-28 12:38:59.095325] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.223 [2024-11-28 12:38:59.147938] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:29.223 [2024-11-28 12:38:59.164043] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:29.223 INFO: Running with entropic power schedule (0xFF, 100). 00:08:29.223 INFO: Seed: 768241686 00:08:29.223 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:29.223 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:29.223 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:29.223 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.223 #2 INITED exec/s: 0 rss: 66Mb 00:08:29.223 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:29.223 This may also happen if the target rejected all inputs we tried so far 00:08:29.223 [2024-11-28 12:38:59.208962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.223 [2024-11-28 12:38:59.208996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.223 [2024-11-28 12:38:59.209031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.223 [2024-11-28 12:38:59.209049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.223 [2024-11-28 12:38:59.209080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.223 [2024-11-28 12:38:59.209097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.223 [2024-11-28 12:38:59.209127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.223 [2024-11-28 12:38:59.209143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.483 NEW_FUNC[1/717]: 0x475848 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:29.483 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:29.483 #14 NEW cov: 12349 ft: 12348 corp: 2/99b lim: 105 exec/s: 0 rss: 73Mb L: 98/98 MS: 2 CopyPart-InsertRepeatedBytes- 00:08:29.483 [2024-11-28 12:38:59.559003] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.483 [2024-11-28 12:38:59.559046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.483 [2024-11-28 12:38:59.559081] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.483 [2024-11-28 12:38:59.559100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.483 [2024-11-28 12:38:59.559130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.483 [2024-11-28 12:38:59.559147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.483 [2024-11-28 12:38:59.559177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.483 [2024-11-28 12:38:59.559193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.743 #20 NEW cov: 12471 ft: 12859 corp: 3/197b lim: 105 exec/s: 0 rss: 74Mb L: 98/98 MS: 1 ChangeBinInt- 00:08:29.743 [2024-11-28 12:38:59.648870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.743 [2024-11-28 12:38:59.648901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.743 [2024-11-28 12:38:59.648948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.743 [2024-11-28 12:38:59.648967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.743 [2024-11-28 12:38:59.648998] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.743 [2024-11-28 12:38:59.649015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.743 [2024-11-28 12:38:59.649044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.743 [2024-11-28 12:38:59.649061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.743 #21 NEW cov: 12477 ft: 13127 corp: 4/301b lim: 105 exec/s: 0 rss: 74Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:08:29.743 [2024-11-28 12:38:59.738908] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.744 [2024-11-28 12:38:59.738939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.744 [2024-11-28 12:38:59.738971] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.744 [2024-11-28 12:38:59.738988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:29.744 [2024-11-28 12:38:59.739017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.744 [2024-11-28 12:38:59.739032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.744 [2024-11-28 12:38:59.739060] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.744 [2024-11-28 12:38:59.739075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:29.744 #22 NEW cov: 12562 ft: 13425 corp: 5/399b lim: 105 exec/s: 0 rss: 74Mb L: 98/104 MS: 1 CrossOver- 00:08:29.744 [2024-11-28 12:38:59.798913] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.744 [2024-11-28 12:38:59.798947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:29.744 [2024-11-28 12:38:59.798996] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.744 [2024-11-28 12:38:59.799014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:29.744 [2024-11-28 12:38:59.799045] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.744 [2024-11-28 12:38:59.799061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:29.744 [2024-11-28 12:38:59.799091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.744 [2024-11-28 12:38:59.799112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.004 #23 NEW cov: 12562 ft: 13636 corp: 6/503b lim: 105 exec/s: 0 rss: 74Mb L: 104/104 MS: 1 ChangeByte- 00:08:30.004 [2024-11-28 12:38:59.888899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:38:59.888930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:38:59.888977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:38:59.888995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:38:59.889026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:38:59.889042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:38:59.889071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:38:59.889087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.004 #24 NEW cov: 12562 ft: 13682 corp: 7/607b lim: 105 exec/s: 0 rss: 74Mb L: 104/104 MS: 1 ShuffleBytes- 00:08:30.004 [2024-11-28 12:38:59.948838] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12370169556722088458 len:43948 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:38:59.948870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:38:59.948905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12370169555311111083 len:43948 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:38:59.948922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.004 #29 NEW cov: 12562 ft: 14379 corp: 8/664b lim: 105 exec/s: 0 rss: 74Mb L: 57/104 MS: 5 CopyPart-InsertByte-InsertByte-ChangeByte-InsertRepeatedBytes- 00:08:30.004 [2024-11-28 12:39:00.008961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:39:00.008994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:39:00.009058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:39:00.009093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:39:00.009129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:39:00.009147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.004 #30 NEW cov: 12562 ft: 14712 corp: 9/739b lim: 105 exec/s: 0 rss: 74Mb L: 75/104 MS: 1 EraseBytes- 00:08:30.004 [2024-11-28 12:39:00.079085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:39:00.079133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:39:00.079170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:39:00.079190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:39:00.079221] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ 
sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:39:00.079238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.004 [2024-11-28 12:39:00.079269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.004 [2024-11-28 12:39:00.079286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.264 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:30.264 #31 NEW cov: 12579 ft: 14855 corp: 10/843b lim: 105 exec/s: 0 rss: 74Mb L: 104/104 MS: 1 CopyPart- 00:08:30.264 [2024-11-28 12:39:00.178960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12370169556722088458 len:43948 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.178998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.264 [2024-11-28 12:39:00.179035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12370169555311111083 len:43948 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.179053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.264 #32 NEW cov: 12579 ft: 14904 corp: 11/900b lim: 105 exec/s: 32 rss: 74Mb L: 57/104 MS: 1 ShuffleBytes- 00:08:30.264 [2024-11-28 12:39:00.279043] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.279077] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.264 [2024-11-28 12:39:00.279127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.279146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.264 [2024-11-28 12:39:00.279178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.279195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.264 [2024-11-28 12:39:00.279226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.279243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.264 #33 NEW cov: 12579 ft: 14928 corp: 12/1000b lim: 105 exec/s: 33 rss: 74Mb L: 100/104 MS: 1 CMP- DE: "\030\000"- 00:08:30.264 [2024-11-28 12:39:00.339064] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.339097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.264 [2024-11-28 12:39:00.339152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.339171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.264 [2024-11-28 12:39:00.339202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.339219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.264 [2024-11-28 12:39:00.339249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.264 [2024-11-28 12:39:00.339266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.264 #34 NEW cov: 12579 ft: 14948 corp: 13/1104b lim: 105 exec/s: 34 rss: 74Mb L: 104/104 MS: 1 ShuffleBytes- 00:08:30.523 [2024-11-28 12:39:00.399065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.399096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.523 [2024-11-28 12:39:00.399143] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.399161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.523 [2024-11-28 12:39:00.399192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.399208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.523 [2024-11-28 12:39:00.399237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.399254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.523 #35 NEW cov: 12579 ft: 15055 corp: 14/1206b lim: 105 exec/s: 35 rss: 74Mb L: 102/104 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:30.523 [2024-11-28 12:39:00.489042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.489073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.523 [2024-11-28 12:39:00.489122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ 
sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.489140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.523 [2024-11-28 12:39:00.489172] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.489189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.523 #36 NEW cov: 12579 ft: 15122 corp: 15/1281b lim: 105 exec/s: 36 rss: 74Mb L: 75/104 MS: 1 ChangeBit- 00:08:30.523 [2024-11-28 12:39:00.589110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.589144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.523 [2024-11-28 12:39:00.589182] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709486079 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.589217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.523 [2024-11-28 12:39:00.589250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.589269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.523 [2024-11-28 12:39:00.589300] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.523 [2024-11-28 12:39:00.589317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.523 #42 NEW cov: 12579 ft: 15163 corp: 16/1379b lim: 105 exec/s: 42 rss: 74Mb L: 98/104 MS: 1 ChangeBinInt- 00:08:30.784 [2024-11-28 12:39:00.649158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.649193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.649227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.649246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.649278] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.649295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.649325] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.649342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.784 #43 NEW cov: 12579 ft: 15179 corp: 17/1481b lim: 105 exec/s: 43 rss: 74Mb L: 102/104 MS: 1 ShuffleBytes- 00:08:30.784 [2024-11-28 12:39:00.749111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:12370169556722088458 len:43948 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.749147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.749183] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:12370169555311111083 len:43948 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.749202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.784 #44 NEW cov: 12579 ft: 15242 corp: 18/1538b lim: 105 exec/s: 44 rss: 75Mb L: 57/104 MS: 1 ChangeBinInt- 00:08:30.784 [2024-11-28 12:39:00.839156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.839190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.839238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.839260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.839291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.839307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.839337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073701097471 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.839353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:30.784 #45 NEW cov: 12579 ft: 15261 corp: 19/1637b lim: 105 exec/s: 45 rss: 75Mb L: 99/104 MS: 1 InsertByte- 00:08:30.784 [2024-11-28 12:39:00.899159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.899189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.899237] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.899255] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.899286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.899302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:30.784 [2024-11-28 12:39:00.899332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.784 [2024-11-28 12:39:00.899348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.045 #46 NEW cov: 12579 ft: 15282 corp: 20/1735b lim: 105 exec/s: 46 rss: 75Mb L: 98/104 MS: 1 ShuffleBytes- 00:08:31.045 [2024-11-28 12:39:00.949169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:00.949199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.045 [2024-11-28 12:39:00.949246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:00.949264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.045 [2024-11-28 12:39:00.949295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:00.949311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.045 [2024-11-28 12:39:00.949340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:00.949356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.045 #47 NEW cov: 12579 ft: 15300 corp: 21/1834b lim: 105 exec/s: 47 rss: 75Mb L: 99/104 MS: 1 InsertByte- 00:08:31.045 [2024-11-28 12:39:00.999178] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:00.999212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.045 [2024-11-28 12:39:00.999262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073708961791 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:00.999280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.045 [2024-11-28 12:39:00.999311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK 
TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:00.999327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.045 [2024-11-28 12:39:00.999356] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:00.999373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.045 #53 NEW cov: 12579 ft: 15309 corp: 22/1937b lim: 105 exec/s: 53 rss: 75Mb L: 103/104 MS: 1 CopyPart- 00:08:31.045 [2024-11-28 12:39:01.089068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.045 [2024-11-28 12:39:01.089098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.045 #54 NEW cov: 12586 ft: 15779 corp: 23/1976b lim: 105 exec/s: 54 rss: 75Mb L: 39/104 MS: 1 CrossOver- 00:08:31.304 [2024-11-28 12:39:01.179225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.304 [2024-11-28 12:39:01.179255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.304 [2024-11-28 12:39:01.179301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.304 [2024-11-28 12:39:01.179320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:31.304 [2024-11-28 12:39:01.179351] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744035054845951 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.304 [2024-11-28 12:39:01.179368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:31.304 [2024-11-28 12:39:01.179397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.304 [2024-11-28 12:39:01.179413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:31.304 #55 NEW cov: 12586 ft: 15793 corp: 24/2078b lim: 105 exec/s: 27 rss: 75Mb L: 102/104 MS: 1 CopyPart- 00:08:31.304 #55 DONE cov: 12586 ft: 15793 corp: 24/2078b lim: 105 exec/s: 27 rss: 75Mb 00:08:31.304 ###### Recommended dictionary. ###### 00:08:31.304 "\030\000" # Uses: 0 00:08:31.304 "\377\377\377\377" # Uses: 0 00:08:31.304 ###### End of recommended dictionary. 
###### 00:08:31.304 Done 55 runs in 2 second(s) 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 17 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4417 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:31.304 12:39:01 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:31.304 [2024-11-28 12:39:01.365635] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:31.304 [2024-11-28 12:39:01.365708] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid612224 ] 00:08:31.563 [2024-11-28 12:39:01.686054] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:31.823 [2024-11-28 12:39:01.734066] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.823 [2024-11-28 12:39:01.750938] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.823 [2024-11-28 12:39:01.803880] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:31.823 [2024-11-28 12:39:01.820000] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:31.823 INFO: Running with entropic power schedule (0xFF, 100). 00:08:31.823 INFO: Seed: 3424225498 00:08:31.823 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:31.823 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:31.823 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:31.823 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.823 #2 INITED exec/s: 0 rss: 66Mb 00:08:31.823 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.823 This may also happen if the target rejected all inputs we tried so far 00:08:31.823 [2024-11-28 12:39:01.886457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.823 [2024-11-28 12:39:01.886511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:31.823 [2024-11-28 12:39:01.886639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:31.823 [2024-11-28 12:39:01.886666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.082 NEW_FUNC[1/718]: 0x478bc8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:32.082 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:32.082 #5 NEW cov: 12379 ft: 12380 corp: 2/56b lim: 120 exec/s: 0 rss: 73Mb L: 55/55 MS: 3 ChangeBinInt-CopyPart-InsertRepeatedBytes- 00:08:32.342 [2024-11-28 12:39:02.237036] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.342 [2024-11-28 12:39:02.237088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.342 [2024-11-28 12:39:02.237202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.342 [2024-11-28 12:39:02.237227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.342 #6 NEW cov: 12492 ft: 13066 corp: 3/111b lim: 120 exec/s: 0 rss: 73Mb L: 55/55 MS: 1 ChangeBinInt- 00:08:32.342 [2024-11-28 12:39:02.307013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.342 [2024-11-28 12:39:02.307044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.342 [2024-11-28 12:39:02.307112] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.342 [2024-11-28 12:39:02.307131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.342 #9 NEW cov: 12498 ft: 13284 corp: 4/160b lim: 120 exec/s: 0 rss: 73Mb L: 49/55 MS: 3 ChangeBinInt-ChangeByte-InsertRepeatedBytes- 00:08:32.342 [2024-11-28 12:39:02.357173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.342 [2024-11-28 12:39:02.357203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.342 [2024-11-28 12:39:02.357261] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.342 [2024-11-28 12:39:02.357277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.342 #10 NEW cov: 12583 ft: 13566 corp: 5/230b lim: 120 exec/s: 0 rss: 73Mb L: 70/70 MS: 1 CopyPart- 00:08:32.342 [2024-11-28 12:39:02.407166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.342 [2024-11-28 12:39:02.407195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.342 [2024-11-28 12:39:02.407297] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.342 [2024-11-28 12:39:02.407313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.342 #16 NEW cov: 12583 ft: 13734 corp: 6/300b lim: 120 exec/s: 0 rss: 73Mb L: 70/70 MS: 1 ChangeBinInt- 00:08:32.602 [2024-11-28 12:39:02.477403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:262 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.477434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.602 [2024-11-28 12:39:02.477498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.477517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.602 #17 NEW cov: 12583 ft: 13795 corp: 7/355b lim: 120 exec/s: 0 rss: 73Mb L: 55/70 MS: 1 CMP- DE: "\001\006"- 00:08:32.602 [2024-11-28 12:39:02.527424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.527453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.602 [2024-11-28 12:39:02.527564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.527582] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.602 #20 NEW cov: 12583 ft: 13844 corp: 8/403b lim: 120 exec/s: 0 rss: 73Mb L: 48/70 MS: 3 InsertByte-ChangeBit-CrossOver- 00:08:32.602 [2024-11-28 12:39:02.577481] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.577513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.602 [2024-11-28 12:39:02.577584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.577604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.602 #21 NEW cov: 12583 ft: 13910 corp: 9/474b lim: 120 exec/s: 0 rss: 74Mb L: 71/71 MS: 1 CopyPart- 00:08:32.602 [2024-11-28 12:39:02.647534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14612714910544429514 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.647566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.602 [2024-11-28 12:39:02.647669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.647690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.602 #26 NEW cov: 12583 ft: 13930 corp: 10/545b lim: 120 exec/s: 0 rss: 74Mb L: 71/71 MS: 5 InsertByte-CopyPart-PersAutoDict-ChangeBit-InsertRepeatedBytes- DE: "\001\006"- 00:08:32.602 [2024-11-28 12:39:02.697655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.697686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.602 [2024-11-28 12:39:02.697775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.602 [2024-11-28 12:39:02.697790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.602 #27 NEW cov: 12583 ft: 13976 corp: 11/615b lim: 120 exec/s: 0 rss: 74Mb L: 70/71 MS: 1 ShuffleBytes- 00:08:32.862 [2024-11-28 12:39:02.747439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.862 [2024-11-28 12:39:02.747475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.862 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:32.862 #28 NEW cov: 12606 ft: 14848 corp: 12/655b lim: 120 exec/s: 0 rss: 74Mb L: 40/71 MS: 1 EraseBytes- 00:08:32.862 [2024-11-28 12:39:02.827900] nvme_qpair.c: 247:nvme_io_qpair_print_command: 
*NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.862 [2024-11-28 12:39:02.827935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.862 [2024-11-28 12:39:02.828021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.862 [2024-11-28 12:39:02.828037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.862 #29 NEW cov: 12606 ft: 14870 corp: 13/726b lim: 120 exec/s: 29 rss: 74Mb L: 71/71 MS: 1 CopyPart- 00:08:32.862 [2024-11-28 12:39:02.898677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.862 [2024-11-28 12:39:02.898709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.862 [2024-11-28 12:39:02.898787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.862 [2024-11-28 12:39:02.898809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.862 [2024-11-28 12:39:02.898870] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:73746445228769280 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.862 [2024-11-28 12:39:02.898887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.862 [2024-11-28 12:39:02.898978] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.862 [2024-11-28 12:39:02.898997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.862 #30 NEW cov: 12606 ft: 15308 corp: 14/845b lim: 120 exec/s: 30 rss: 74Mb L: 119/119 MS: 1 CrossOver- 00:08:32.863 [2024-11-28 12:39:02.957928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.863 [2024-11-28 12:39:02.957957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.863 [2024-11-28 12:39:02.958024] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:32.863 [2024-11-28 12:39:02.958043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.863 #31 NEW cov: 12606 ft: 15338 corp: 15/915b lim: 120 exec/s: 31 rss: 74Mb L: 70/119 MS: 1 ChangeByte- 00:08:33.123 [2024-11-28 12:39:03.008468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14612714910544429514 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.008502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.123 
[2024-11-28 12:39:03.008577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.008597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.123 [2024-11-28 12:39:03.008673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:14612714913291487946 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.008695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.123 #32 NEW cov: 12606 ft: 15638 corp: 16/1007b lim: 120 exec/s: 32 rss: 74Mb L: 92/119 MS: 1 CrossOver- 00:08:33.123 [2024-11-28 12:39:03.078209] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.078245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.123 [2024-11-28 12:39:03.078343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.078364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.123 #33 NEW cov: 12606 ft: 15699 corp: 17/1078b lim: 120 exec/s: 33 rss: 74Mb L: 71/119 MS: 1 CopyPart- 00:08:33.123 [2024-11-28 12:39:03.148271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.148302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.123 [2024-11-28 12:39:03.148403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.148421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.123 #34 NEW cov: 12606 ft: 15704 corp: 18/1127b lim: 120 exec/s: 34 rss: 74Mb L: 49/119 MS: 1 ChangeBinInt- 00:08:33.123 [2024-11-28 12:39:03.198285] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14612714910544429514 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.198315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.123 [2024-11-28 12:39:03.198374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.198392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.123 #35 NEW cov: 12606 ft: 15735 corp: 19/1198b lim: 120 exec/s: 35 rss: 74Mb L: 71/119 MS: 1 ChangeBit- 00:08:33.123 [2024-11-28 12:39:03.248277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 
nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.248307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.123 [2024-11-28 12:39:03.248367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.123 [2024-11-28 12:39:03.248390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.382 #36 NEW cov: 12606 ft: 15744 corp: 20/1263b lim: 120 exec/s: 36 rss: 74Mb L: 65/119 MS: 1 EraseBytes- 00:08:33.383 [2024-11-28 12:39:03.298516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.298544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.383 [2024-11-28 12:39:03.298654] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.298672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.383 #37 NEW cov: 12606 ft: 15751 corp: 21/1334b lim: 120 exec/s: 37 rss: 74Mb L: 71/119 MS: 1 CrossOver- 00:08:33.383 [2024-11-28 12:39:03.348574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:14612714910544429514 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.348603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.383 [2024-11-28 12:39:03.348673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:14612714913291487946 len:51915 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.348690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.383 #38 NEW cov: 12606 ft: 15778 corp: 22/1405b lim: 120 exec/s: 38 rss: 74Mb L: 71/119 MS: 1 CrossOver- 00:08:33.383 [2024-11-28 12:39:03.399368] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.399396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.383 [2024-11-28 12:39:03.399487] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:8029759185026510703 len:28528 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.399504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.383 [2024-11-28 12:39:03.399569] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:8029759185026510703 len:28528 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.399586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 
cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.383 [2024-11-28 12:39:03.399679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:8029759185026510703 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.399696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.383 #39 NEW cov: 12606 ft: 15789 corp: 23/1511b lim: 120 exec/s: 39 rss: 74Mb L: 106/119 MS: 1 InsertRepeatedBytes- 00:08:33.383 [2024-11-28 12:39:03.459160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.459191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.383 [2024-11-28 12:39:03.459275] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.459295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.383 [2024-11-28 12:39:03.459376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.383 [2024-11-28 12:39:03.459394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.383 #40 NEW cov: 12606 ft: 15798 corp: 24/1603b lim: 120 exec/s: 40 rss: 74Mb L: 92/119 MS: 1 CrossOver- 00:08:33.643 [2024-11-28 12:39:03.509332] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.509362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.643 [2024-11-28 12:39:03.509432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:3 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.509448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.643 [2024-11-28 12:39:03.509539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.509557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.643 #41 NEW cov: 12606 ft: 15877 corp: 25/1695b lim: 120 exec/s: 41 rss: 74Mb L: 92/119 MS: 1 ChangeBit- 00:08:33.643 [2024-11-28 12:39:03.578995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:454669289439317761 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.579024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.643 [2024-11-28 12:39:03.579090] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:5714873654208057167 len:20304 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.579106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.643 #42 NEW cov: 12606 ft: 15891 corp: 26/1746b lim: 120 exec/s: 42 rss: 74Mb L: 51/119 MS: 1 PersAutoDict- DE: "\001\006"- 00:08:33.643 [2024-11-28 12:39:03.649146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.649177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.643 [2024-11-28 12:39:03.649244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:176093659136 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.649264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.643 #43 NEW cov: 12606 ft: 15901 corp: 27/1816b lim: 120 exec/s: 43 rss: 74Mb L: 70/119 MS: 1 ChangeByte- 00:08:33.643 [2024-11-28 12:39:03.719273] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.719301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.643 [2024-11-28 12:39:03.719367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.643 [2024-11-28 12:39:03.719385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.643 #44 NEW cov: 12606 ft: 15946 corp: 28/1886b lim: 120 exec/s: 44 rss: 74Mb L: 70/119 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:33.903 [2024-11-28 12:39:03.789537] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.903 [2024-11-28 12:39:03.789567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.903 [2024-11-28 12:39:03.789618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:176093659136 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.903 [2024-11-28 12:39:03.789636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.903 #50 NEW cov: 12606 ft: 15955 corp: 29/1956b lim: 120 exec/s: 50 rss: 74Mb L: 70/119 MS: 1 ChangeBit- 00:08:33.903 [2024-11-28 12:39:03.859883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.903 [2024-11-28 12:39:03.859915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.903 [2024-11-28 12:39:03.860009] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:33.903 [2024-11-28 12:39:03.860025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.903 #51 NEW cov: 12606 ft: 15967 corp: 30/2026b lim: 120 exec/s: 25 rss: 74Mb L: 70/119 MS: 1 ShuffleBytes- 00:08:33.903 #51 
DONE cov: 12606 ft: 15967 corp: 30/2026b lim: 120 exec/s: 25 rss: 74Mb 00:08:33.903 ###### Recommended dictionary. ###### 00:08:33.903 "\001\006" # Uses: 2 00:08:33.903 "\377\377\377\377\377\377\377\377" # Uses: 0 00:08:33.903 ###### End of recommended dictionary. ###### 00:08:33.903 Done 51 runs in 2 second(s) 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 18 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4418 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:33.903 12:39:03 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:33.903 12:39:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:33.903 12:39:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:33.904 12:39:04 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:34.163 [2024-11-28 12:39:04.032860] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:34.163 [2024-11-28 12:39:04.032937] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid612697 ] 00:08:34.423 [2024-11-28 12:39:04.359976] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:34.423 [2024-11-28 12:39:04.407318] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:34.423 [2024-11-28 12:39:04.423936] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.423 [2024-11-28 12:39:04.476503] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:34.423 [2024-11-28 12:39:04.492624] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:34.423 INFO: Running with entropic power schedule (0xFF, 100). 00:08:34.423 INFO: Seed: 1801260489 00:08:34.423 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:34.423 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:34.423 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:34.423 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.423 #2 INITED exec/s: 0 rss: 66Mb 00:08:34.423 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:34.423 This may also happen if the target rejected all inputs we tried so far 00:08:34.683 [2024-11-28 12:39:04.559159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:34.683 [2024-11-28 12:39:04.559208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.683 [2024-11-28 12:39:04.559318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:34.683 [2024-11-28 12:39:04.559338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.683 [2024-11-28 12:39:04.559441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:34.683 [2024-11-28 12:39:04.559463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.942 NEW_FUNC[1/716]: 0x47c4b8 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:34.942 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:34.942 #6 NEW cov: 12322 ft: 12323 corp: 2/72b lim: 100 exec/s: 0 rss: 73Mb L: 71/71 MS: 4 ChangeBit-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:34.942 [2024-11-28 12:39:04.920408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:34.942 [2024-11-28 12:39:04.920451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 12:39:04.920543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:34.942 [2024-11-28 12:39:04.920560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 12:39:04.920648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:34.942 [2024-11-28 12:39:04.920663] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 
12:39:04.920754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:34.942 [2024-11-28 12:39:04.920771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 12:39:04.920853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:34.942 [2024-11-28 12:39:04.920872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:34.942 #11 NEW cov: 12435 ft: 13346 corp: 3/172b lim: 100 exec/s: 0 rss: 73Mb L: 100/100 MS: 5 ChangeBit-ShuffleBytes-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:34.942 [2024-11-28 12:39:04.980295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:34.942 [2024-11-28 12:39:04.980329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 12:39:04.980396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:34.942 [2024-11-28 12:39:04.980414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 12:39:04.980467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:34.942 [2024-11-28 12:39:04.980487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 12:39:04.980584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:34.942 [2024-11-28 12:39:04.980604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.942 #13 NEW cov: 12441 ft: 13529 corp: 4/267b lim: 100 exec/s: 0 rss: 73Mb L: 95/100 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:34.942 [2024-11-28 12:39:05.030223] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:34.942 [2024-11-28 12:39:05.030259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 12:39:05.030315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:34.942 [2024-11-28 12:39:05.030345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.942 [2024-11-28 12:39:05.030403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:34.942 [2024-11-28 12:39:05.030418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.201 #14 NEW cov: 12526 ft: 13751 corp: 5/338b lim: 100 exec/s: 0 rss: 73Mb L: 71/100 MS: 1 CrossOver- 00:08:35.201 [2024-11-28 12:39:05.100404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.201 [2024-11-28 12:39:05.100435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 
12:39:05.100510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.201 [2024-11-28 12:39:05.100528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 12:39:05.100604] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.201 [2024-11-28 12:39:05.100621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.201 #15 NEW cov: 12526 ft: 13860 corp: 6/415b lim: 100 exec/s: 0 rss: 73Mb L: 77/100 MS: 1 CopyPart- 00:08:35.201 [2024-11-28 12:39:05.171317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.201 [2024-11-28 12:39:05.171346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 12:39:05.171437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.201 [2024-11-28 12:39:05.171454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 12:39:05.171545] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.201 [2024-11-28 12:39:05.171564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 12:39:05.171651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.201 [2024-11-28 12:39:05.171667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 12:39:05.171754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:35.201 [2024-11-28 12:39:05.171772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.201 #16 NEW cov: 12526 ft: 13952 corp: 7/515b lim: 100 exec/s: 0 rss: 73Mb L: 100/100 MS: 1 CrossOver- 00:08:35.201 [2024-11-28 12:39:05.251155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.201 [2024-11-28 12:39:05.251186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 12:39:05.251256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.201 [2024-11-28 12:39:05.251274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 12:39:05.251342] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.201 [2024-11-28 12:39:05.251356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.201 [2024-11-28 12:39:05.251449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.201 [2024-11-28 12:39:05.251477] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.201 #17 NEW cov: 12526 ft: 13999 corp: 8/614b lim: 100 exec/s: 0 rss: 74Mb L: 99/100 MS: 1 CrossOver- 00:08:35.461 [2024-11-28 12:39:05.330878] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.461 [2024-11-28 12:39:05.330906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.330978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.461 [2024-11-28 12:39:05.330994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.331083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.461 [2024-11-28 12:39:05.331101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.461 #18 NEW cov: 12526 ft: 14109 corp: 9/691b lim: 100 exec/s: 0 rss: 74Mb L: 77/100 MS: 1 CrossOver- 00:08:35.461 [2024-11-28 12:39:05.401490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.461 [2024-11-28 12:39:05.401522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.401603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.461 [2024-11-28 12:39:05.401623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.401702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.461 [2024-11-28 12:39:05.401722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.401810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.461 [2024-11-28 12:39:05.401828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.401915] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:35.461 [2024-11-28 12:39:05.401935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:35.461 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:35.461 #19 NEW cov: 12549 ft: 14129 corp: 10/791b lim: 100 exec/s: 0 rss: 74Mb L: 100/100 MS: 1 CopyPart- 00:08:35.461 [2024-11-28 12:39:05.461627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.461 [2024-11-28 12:39:05.461655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.461736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 
00:08:35.461 [2024-11-28 12:39:05.461752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.461844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.461 [2024-11-28 12:39:05.461863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.461972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.461 [2024-11-28 12:39:05.461991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.461 #20 NEW cov: 12549 ft: 14178 corp: 11/887b lim: 100 exec/s: 0 rss: 74Mb L: 96/100 MS: 1 InsertByte- 00:08:35.461 [2024-11-28 12:39:05.531687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.461 [2024-11-28 12:39:05.531727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.531785] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.461 [2024-11-28 12:39:05.531801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.531868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.461 [2024-11-28 12:39:05.531883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.461 [2024-11-28 12:39:05.531974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.461 [2024-11-28 12:39:05.531990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.461 #21 NEW cov: 12549 ft: 14192 corp: 12/982b lim: 100 exec/s: 21 rss: 74Mb L: 95/100 MS: 1 ShuffleBytes- 00:08:35.461 [2024-11-28 12:39:05.580909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.461 [2024-11-28 12:39:05.580936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.750 #22 NEW cov: 12549 ft: 14583 corp: 13/1015b lim: 100 exec/s: 22 rss: 74Mb L: 33/100 MS: 1 InsertRepeatedBytes- 00:08:35.750 [2024-11-28 12:39:05.631816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.750 [2024-11-28 12:39:05.631842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.631925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.750 [2024-11-28 12:39:05.631941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.632030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.750 [2024-11-28 12:39:05.632047] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.632132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.750 [2024-11-28 12:39:05.632152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.750 #23 NEW cov: 12549 ft: 14630 corp: 14/1114b lim: 100 exec/s: 23 rss: 74Mb L: 99/100 MS: 1 ChangeBit- 00:08:35.750 [2024-11-28 12:39:05.702118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.750 [2024-11-28 12:39:05.702146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.702249] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.750 [2024-11-28 12:39:05.702268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.702354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.750 [2024-11-28 12:39:05.702374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.702468] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.750 [2024-11-28 12:39:05.702491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.750 #24 NEW cov: 12549 ft: 14678 corp: 15/1209b lim: 100 exec/s: 24 rss: 74Mb L: 95/100 MS: 1 InsertRepeatedBytes- 00:08:35.750 [2024-11-28 12:39:05.772605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.750 [2024-11-28 12:39:05.772631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.772711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.750 [2024-11-28 12:39:05.772729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.772813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.750 [2024-11-28 12:39:05.772830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.772924] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.750 [2024-11-28 12:39:05.772942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.750 #30 NEW cov: 12549 ft: 14711 corp: 16/1304b lim: 100 exec/s: 30 rss: 74Mb L: 95/100 MS: 1 ShuffleBytes- 00:08:35.750 [2024-11-28 12:39:05.842734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:35.750 [2024-11-28 12:39:05.842763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.842838] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:35.750 [2024-11-28 12:39:05.842855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.842941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:35.750 [2024-11-28 12:39:05.842958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.750 [2024-11-28 12:39:05.843054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:35.750 [2024-11-28 12:39:05.843071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.010 #31 NEW cov: 12549 ft: 14767 corp: 17/1400b lim: 100 exec/s: 31 rss: 74Mb L: 96/100 MS: 1 ChangeBinInt- 00:08:36.010 [2024-11-28 12:39:05.912960] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.010 [2024-11-28 12:39:05.912987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:05.913088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.010 [2024-11-28 12:39:05.913103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:05.913181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.010 [2024-11-28 12:39:05.913194] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:05.913283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.010 [2024-11-28 12:39:05.913303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.010 #32 NEW cov: 12549 ft: 14792 corp: 18/1486b lim: 100 exec/s: 32 rss: 74Mb L: 86/100 MS: 1 InsertRepeatedBytes- 00:08:36.010 [2024-11-28 12:39:05.963250] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.010 [2024-11-28 12:39:05.963274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:05.963410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.010 [2024-11-28 12:39:05.963426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:05.963524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.010 [2024-11-28 12:39:05.963538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:05.963627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES 
(08) sqid:1 cid:3 nsid:0 00:08:36.010 [2024-11-28 12:39:05.963645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:05.963733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:36.010 [2024-11-28 12:39:05.963751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.010 #33 NEW cov: 12549 ft: 14794 corp: 19/1586b lim: 100 exec/s: 33 rss: 74Mb L: 100/100 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:36.010 [2024-11-28 12:39:06.032820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.010 [2024-11-28 12:39:06.032849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:06.032934] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.010 [2024-11-28 12:39:06.032952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:06.033035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.010 [2024-11-28 12:39:06.033054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.010 #34 NEW cov: 12549 ft: 14832 corp: 20/1661b lim: 100 exec/s: 34 rss: 74Mb L: 75/100 MS: 1 EraseBytes- 00:08:36.010 [2024-11-28 12:39:06.083332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.010 [2024-11-28 12:39:06.083360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:06.083461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.010 [2024-11-28 12:39:06.083484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:06.083568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.010 [2024-11-28 12:39:06.083584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.010 [2024-11-28 12:39:06.083675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.010 [2024-11-28 12:39:06.083694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.010 #35 NEW cov: 12549 ft: 14847 corp: 21/1747b lim: 100 exec/s: 35 rss: 74Mb L: 86/100 MS: 1 CopyPart- 00:08:36.270 [2024-11-28 12:39:06.153832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.270 [2024-11-28 12:39:06.153864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.270 [2024-11-28 12:39:06.153951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.270 [2024-11-28 
12:39:06.153968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.270 [2024-11-28 12:39:06.154050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.270 [2024-11-28 12:39:06.154063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.270 [2024-11-28 12:39:06.154150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.270 [2024-11-28 12:39:06.154166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.270 #36 NEW cov: 12549 ft: 14867 corp: 22/1842b lim: 100 exec/s: 36 rss: 74Mb L: 95/100 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:36.270 [2024-11-28 12:39:06.204286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.270 [2024-11-28 12:39:06.204314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.270 [2024-11-28 12:39:06.204403] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.270 [2024-11-28 12:39:06.204420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.270 [2024-11-28 12:39:06.204518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.271 [2024-11-28 12:39:06.204535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.204619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.271 [2024-11-28 12:39:06.204636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.204724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:36.271 [2024-11-28 12:39:06.204740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.271 #37 NEW cov: 12549 ft: 14880 corp: 23/1942b lim: 100 exec/s: 37 rss: 74Mb L: 100/100 MS: 1 InsertByte- 00:08:36.271 [2024-11-28 12:39:06.274323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.271 [2024-11-28 12:39:06.274350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.274440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.271 [2024-11-28 12:39:06.274458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.274555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.271 [2024-11-28 12:39:06.274571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:36.271 [2024-11-28 12:39:06.274657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.271 [2024-11-28 12:39:06.274672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.271 #38 NEW cov: 12549 ft: 14956 corp: 24/2036b lim: 100 exec/s: 38 rss: 75Mb L: 94/100 MS: 1 InsertRepeatedBytes- 00:08:36.271 [2024-11-28 12:39:06.344476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.271 [2024-11-28 12:39:06.344505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.344598] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.271 [2024-11-28 12:39:06.344618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.344690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.271 [2024-11-28 12:39:06.344707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.344798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.271 [2024-11-28 12:39:06.344815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.271 #39 NEW cov: 12549 ft: 14969 corp: 25/2134b lim: 100 exec/s: 39 rss: 75Mb L: 98/100 MS: 1 CrossOver- 00:08:36.271 [2024-11-28 12:39:06.394807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.271 [2024-11-28 12:39:06.394833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.394928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.271 [2024-11-28 12:39:06.394951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.395036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.271 [2024-11-28 12:39:06.395055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.271 [2024-11-28 12:39:06.395146] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.271 [2024-11-28 12:39:06.395164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.531 #40 NEW cov: 12549 ft: 15040 corp: 26/2215b lim: 100 exec/s: 40 rss: 75Mb L: 81/100 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:36.531 [2024-11-28 12:39:06.445294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.531 [2024-11-28 12:39:06.445321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.531 [2024-11-28 
12:39:06.445428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.531 [2024-11-28 12:39:06.445444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.531 [2024-11-28 12:39:06.445537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.531 [2024-11-28 12:39:06.445550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.531 [2024-11-28 12:39:06.445634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.531 [2024-11-28 12:39:06.445649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.531 [2024-11-28 12:39:06.445736] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:36.531 [2024-11-28 12:39:06.445752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.531 #41 NEW cov: 12549 ft: 15072 corp: 27/2315b lim: 100 exec/s: 41 rss: 75Mb L: 100/100 MS: 1 CrossOver- 00:08:36.531 [2024-11-28 12:39:06.515265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:36.531 [2024-11-28 12:39:06.515298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.531 [2024-11-28 12:39:06.515381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:36.531 [2024-11-28 12:39:06.515400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.531 [2024-11-28 12:39:06.515459] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:36.531 [2024-11-28 12:39:06.515477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.531 [2024-11-28 12:39:06.515565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:36.531 [2024-11-28 12:39:06.515582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.531 #42 NEW cov: 12549 ft: 15075 corp: 28/2401b lim: 100 exec/s: 21 rss: 75Mb L: 86/100 MS: 1 CrossOver- 00:08:36.531 #42 DONE cov: 12549 ft: 15075 corp: 28/2401b lim: 100 exec/s: 21 rss: 75Mb 00:08:36.531 ###### Recommended dictionary. ###### 00:08:36.531 "\377\377\377\377" # Uses: 2 00:08:36.531 ###### End of recommended dictionary. 
###### 00:08:36.531 Done 42 runs in 2 second(s) 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 19 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4419 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:36.531 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:36.791 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:36.791 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:36.791 12:39:06 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:36.791 [2024-11-28 12:39:06.686590] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:36.791 [2024-11-28 12:39:06.686659] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid613347 ] 00:08:37.051 [2024-11-28 12:39:07.007080] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
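For reference, the start_llvm_fuzz steps traced just above amount to the shell sequence below. This is a minimal sketch reconstructed from the nvmf/run.sh xtrace in this log, not the script itself; the $rootdir variable, the port derivation, the output redirections, and the export are illustrative assumptions that the trace elides.

    # One fuzzer instance per loop iteration: start_llvm_fuzz <N> <time> <core mask>
    N=19
    port=44$(printf %02d $N)    # inferred from the printf %02d / port=4419 pair in the trace
    corpus_dir=$rootdir/../corpus/llvm_nvmf_$N
    nvmf_cfg=/tmp/fuzz_json_$N.conf
    suppress_file=/var/tmp/suppress_nvmf_fuzz
    # LSAN settings exactly as traced; exporting them is an assumption of this sketch
    export LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0
    mkdir -p $corpus_dir
    # Rewrite the shared JSON config so this instance listens on its own TCP port
    # (redirection into $nvmf_cfg is assumed; the trace shows only the sed command)
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        $rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf > $nvmf_cfg
    # LeakSanitizer suppressions for allocations the target holds by design
    echo leak:spdk_nvmf_qpair_disconnect > $suppress_file
    echo leak:nvmf_ctrlr_create >> $suppress_file
    # Time-boxed run (-t 1) of fuzzer $N (-Z) against the NVMe/TCP listener set up above
    $rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 \
        -P $rootdir/../output/llvm/ \
        -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
        -c $nvmf_cfg -t 1 -D $corpus_dir -Z $N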
00:08:37.051 [2024-11-28 12:39:07.053038] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.051 [2024-11-28 12:39:07.072765] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.051 [2024-11-28 12:39:07.125342] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:37.051 [2024-11-28 12:39:07.141457] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:37.051 INFO: Running with entropic power schedule (0xFF, 100). 00:08:37.051 INFO: Seed: 153282562 00:08:37.311 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:37.311 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:37.311 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:37.311 INFO: A corpus is not provided, starting from an empty corpus 00:08:37.311 #2 INITED exec/s: 0 rss: 66Mb 00:08:37.311 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:37.311 This may also happen if the target rejected all inputs we tried so far 00:08:37.311 [2024-11-28 12:39:07.210680] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14158203167556944906 len:2757 00:08:37.311 [2024-11-28 12:39:07.210720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.571 NEW_FUNC[1/716]: 0x47f478 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:37.571 NEW_FUNC[2/716]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:37.571 #12 NEW cov: 12284 ft: 12282 corp: 2/11b lim: 50 exec/s: 0 rss: 74Mb L: 10/10 MS: 5 InsertByte-CrossOver-InsertByte-CopyPart-CopyPart- 00:08:37.571 [2024-11-28 12:39:07.551116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:775881418381986500 len:50187 00:08:37.571 [2024-11-28 12:39:07.551158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.571 #20 NEW cov: 12413 ft: 12920 corp: 3/22b lim: 50 exec/s: 0 rss: 74Mb L: 11/11 MS: 3 CopyPart-ShuffleBytes-CrossOver- 00:08:37.571 [2024-11-28 12:39:07.601600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900871474942 len:65279 00:08:37.571 [2024-11-28 12:39:07.601627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.571 [2024-11-28 12:39:07.601685] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18374403900871474942 len:65279 00:08:37.571 [2024-11-28 12:39:07.601702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.571 #21 NEW cov: 12419 ft: 13555 corp: 4/45b lim: 50 exec/s: 0 rss: 74Mb L: 23/23 MS: 1 InsertRepeatedBytes- 00:08:37.571 [2024-11-28 12:39:07.652222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14158203167556944906 len:2757 00:08:37.571 [2024-11-28 12:39:07.652249] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.571 [2024-11-28 12:39:07.652335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2604246222170760228 len:9253 00:08:37.571 [2024-11-28 12:39:07.652353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.571 [2024-11-28 12:39:07.652433] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2604246222170760228 len:9253 00:08:37.571 [2024-11-28 12:39:07.652452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.571 [2024-11-28 12:39:07.652555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2604246222170760228 len:9253 00:08:37.571 [2024-11-28 12:39:07.652571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.571 #22 NEW cov: 12504 ft: 14065 corp: 5/93b lim: 50 exec/s: 0 rss: 74Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:37.831 [2024-11-28 12:39:07.722124] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900871474942 len:65279 00:08:37.831 [2024-11-28 12:39:07.722152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.831 [2024-11-28 12:39:07.722238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18374403900871474771 len:65279 00:08:37.832 [2024-11-28 12:39:07.722253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.832 #28 NEW cov: 12504 ft: 14219 corp: 6/116b lim: 50 exec/s: 0 rss: 74Mb L: 23/48 MS: 1 ChangeByte- 00:08:37.832 [2024-11-28 12:39:07.792557] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:7812738666512280684 len:27757 00:08:37.832 [2024-11-28 12:39:07.792585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.832 [2024-11-28 12:39:07.792649] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7812738666512280684 len:31755 00:08:37.832 [2024-11-28 12:39:07.792665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.832 #29 NEW cov: 12504 ft: 14273 corp: 7/144b lim: 50 exec/s: 0 rss: 74Mb L: 28/48 MS: 1 InsertRepeatedBytes- 00:08:37.832 [2024-11-28 12:39:07.842440] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900519153406 len:65279 00:08:37.832 [2024-11-28 12:39:07.842469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.832 #32 NEW cov: 12504 ft: 14348 corp: 8/156b lim: 50 exec/s: 0 rss: 74Mb L: 12/48 MS: 3 ShuffleBytes-ChangeByte-CrossOver- 00:08:37.832 [2024-11-28 12:39:07.892478] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:775882431994268356 len:15350 00:08:37.832 
[2024-11-28 12:39:07.892507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.832 #33 NEW cov: 12504 ft: 14441 corp: 9/167b lim: 50 exec/s: 0 rss: 74Mb L: 11/48 MS: 1 ChangeBinInt- 00:08:38.091 [2024-11-28 12:39:07.962715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900519153406 len:65279 00:08:38.091 [2024-11-28 12:39:07.962745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.091 #34 NEW cov: 12504 ft: 14503 corp: 10/178b lim: 50 exec/s: 0 rss: 75Mb L: 11/48 MS: 1 EraseBytes- 00:08:38.091 [2024-11-28 12:39:08.032668] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14158203167556945034 len:2757 00:08:38.092 [2024-11-28 12:39:08.032700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.092 #35 NEW cov: 12504 ft: 14543 corp: 11/188b lim: 50 exec/s: 0 rss: 75Mb L: 10/48 MS: 1 ChangeBit- 00:08:38.092 [2024-11-28 12:39:08.083038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900871434494 len:65279 00:08:38.092 [2024-11-28 12:39:08.083067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.092 [2024-11-28 12:39:08.083156] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18374403900871474942 len:65279 00:08:38.092 [2024-11-28 12:39:08.083176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.092 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:38.092 #36 NEW cov: 12527 ft: 14614 corp: 12/212b lim: 50 exec/s: 0 rss: 75Mb L: 24/48 MS: 1 InsertByte- 00:08:38.092 [2024-11-28 12:39:08.133594] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4268070196624440123 len:15164 00:08:38.092 [2024-11-28 12:39:08.133621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.092 [2024-11-28 12:39:08.133698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4268070197446523707 len:15164 00:08:38.092 [2024-11-28 12:39:08.133714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.092 [2024-11-28 12:39:08.133808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4268070197446523707 len:15164 00:08:38.092 [2024-11-28 12:39:08.133824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.092 [2024-11-28 12:39:08.133911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4268070197446523707 len:15164 00:08:38.092 [2024-11-28 12:39:08.133930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.092 
#39 NEW cov: 12527 ft: 14649 corp: 13/253b lim: 50 exec/s: 0 rss: 75Mb L: 41/48 MS: 3 ShuffleBytes-ShuffleBytes-InsertRepeatedBytes- 00:08:38.092 [2024-11-28 12:39:08.182912] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14158203167556945034 len:2757 00:08:38.092 [2024-11-28 12:39:08.182940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.351 #40 NEW cov: 12527 ft: 14748 corp: 14/263b lim: 50 exec/s: 40 rss: 75Mb L: 10/48 MS: 1 ShuffleBytes- 00:08:38.351 [2024-11-28 12:39:08.252944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14158203167556897290 len:2757 00:08:38.351 [2024-11-28 12:39:08.252974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.351 #41 NEW cov: 12527 ft: 14785 corp: 15/273b lim: 50 exec/s: 41 rss: 75Mb L: 10/48 MS: 1 CopyPart- 00:08:38.351 [2024-11-28 12:39:08.303886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:868082074035424268 len:3085 00:08:38.351 [2024-11-28 12:39:08.303915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.351 [2024-11-28 12:39:08.303991] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:868082074056920076 len:3085 00:08:38.351 [2024-11-28 12:39:08.304008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.351 [2024-11-28 12:39:08.304082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:868082074056920076 len:3085 00:08:38.351 [2024-11-28 12:39:08.304097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.351 [2024-11-28 12:39:08.304186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:868082074056920076 len:3085 00:08:38.351 [2024-11-28 12:39:08.304203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.351 #44 NEW cov: 12527 ft: 14802 corp: 16/320b lim: 50 exec/s: 44 rss: 75Mb L: 47/48 MS: 3 EraseBytes-ChangeByte-InsertRepeatedBytes- 00:08:38.351 [2024-11-28 12:39:08.373955] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14123288431614524426 len:1 00:08:38.351 [2024-11-28 12:39:08.373981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.352 [2024-11-28 12:39:08.374054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:38.352 [2024-11-28 12:39:08.374072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.352 [2024-11-28 12:39:08.374158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:38.352 [2024-11-28 12:39:08.374172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.352 [2024-11-28 12:39:08.374256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:2757 00:08:38.352 [2024-11-28 12:39:08.374274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.352 #46 NEW cov: 12527 ft: 14825 corp: 17/360b lim: 50 exec/s: 46 rss: 75Mb L: 40/48 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:38.352 [2024-11-28 12:39:08.443544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900871474942 len:65279 00:08:38.352 [2024-11-28 12:39:08.443572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.352 [2024-11-28 12:39:08.443638] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18374403900871474771 len:65279 00:08:38.352 [2024-11-28 12:39:08.443654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.612 #47 NEW cov: 12527 ft: 14850 corp: 18/383b lim: 50 exec/s: 47 rss: 75Mb L: 23/48 MS: 1 ShuffleBytes- 00:08:38.612 [2024-11-28 12:39:08.514155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:868082074035424268 len:3085 00:08:38.612 [2024-11-28 12:39:08.514182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.612 [2024-11-28 12:39:08.514271] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:868082074056920076 len:63732 00:08:38.612 [2024-11-28 12:39:08.514290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.612 [2024-11-28 12:39:08.514369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:17578407020329169907 len:3085 00:08:38.612 [2024-11-28 12:39:08.514387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.612 [2024-11-28 12:39:08.514479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:868082074056920076 len:3085 00:08:38.612 [2024-11-28 12:39:08.514496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.612 #48 NEW cov: 12527 ft: 14890 corp: 19/430b lim: 50 exec/s: 48 rss: 75Mb L: 47/48 MS: 1 ChangeBinInt- 00:08:38.612 [2024-11-28 12:39:08.583517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14158241650463917066 len:2757 00:08:38.612 [2024-11-28 12:39:08.583544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.612 #49 NEW cov: 12527 ft: 14939 corp: 20/440b lim: 50 exec/s: 49 rss: 75Mb L: 10/48 MS: 1 ChangeByte- 00:08:38.612 [2024-11-28 12:39:08.633521] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4461377274858847221 len:65279 00:08:38.612 [2024-11-28 12:39:08.633547] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.612 #50 NEW cov: 12527 ft: 14960 corp: 21/457b lim: 50 exec/s: 50 rss: 75Mb L: 17/48 MS: 1 CrossOver- 00:08:38.612 [2024-11-28 12:39:08.683631] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4461377274858847221 len:65279 00:08:38.612 [2024-11-28 12:39:08.683659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.612 #51 NEW cov: 12527 ft: 14965 corp: 22/474b lim: 50 exec/s: 51 rss: 75Mb L: 17/48 MS: 1 ChangeBinInt- 00:08:38.872 [2024-11-28 12:39:08.754244] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900871474942 len:65279 00:08:38.872 [2024-11-28 12:39:08.754274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.872 [2024-11-28 12:39:08.754341] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18374403900871474771 len:65279 00:08:38.872 [2024-11-28 12:39:08.754357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.872 #52 NEW cov: 12527 ft: 14997 corp: 23/497b lim: 50 exec/s: 52 rss: 75Mb L: 23/48 MS: 1 ShuffleBytes- 00:08:38.872 [2024-11-28 12:39:08.824110] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900871474942 len:65279 00:08:38.872 [2024-11-28 12:39:08.824138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.872 #53 NEW cov: 12527 ft: 15047 corp: 24/510b lim: 50 exec/s: 53 rss: 75Mb L: 13/48 MS: 1 EraseBytes- 00:08:38.872 [2024-11-28 12:39:08.874115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:868082074035424268 len:3085 00:08:38.872 [2024-11-28 12:39:08.874141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.872 #54 NEW cov: 12527 ft: 15051 corp: 25/526b lim: 50 exec/s: 54 rss: 75Mb L: 16/48 MS: 1 CrossOver- 00:08:38.872 [2024-11-28 12:39:08.925154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:14158203167556944906 len:2757 00:08:38.872 [2024-11-28 12:39:08.925182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.872 [2024-11-28 12:39:08.925263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2604246222170760228 len:9253 00:08:38.872 [2024-11-28 12:39:08.925280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.872 [2024-11-28 12:39:08.925358] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2604246222170760228 len:9253 00:08:38.872 [2024-11-28 12:39:08.925373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.872 [2024-11-28 12:39:08.925457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2604246222170760228 len:9253 00:08:38.872 [2024-11-28 12:39:08.925476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.872 [2024-11-28 12:39:08.925559] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:2604246222170760228 len:9253 00:08:38.872 [2024-11-28 12:39:08.925577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.872 #55 NEW cov: 12527 ft: 15080 corp: 26/576b lim: 50 exec/s: 55 rss: 75Mb L: 50/50 MS: 1 CopyPart- 00:08:38.872 [2024-11-28 12:39:08.995082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4268070196624440123 len:15164 00:08:38.872 [2024-11-28 12:39:08.995110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.872 [2024-11-28 12:39:08.995186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:289421401641401092 len:15164 00:08:38.872 [2024-11-28 12:39:08.995203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.872 [2024-11-28 12:39:08.995288] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4268070197446523707 len:15164 00:08:38.872 [2024-11-28 12:39:08.995304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.872 [2024-11-28 12:39:08.995397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4268070197446523707 len:15164 00:08:38.872 [2024-11-28 12:39:08.995416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.132 #56 NEW cov: 12527 ft: 15118 corp: 27/620b lim: 50 exec/s: 56 rss: 75Mb L: 44/50 MS: 1 InsertRepeatedBytes- 00:08:39.132 [2024-11-28 12:39:09.065109] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:868082074035424268 len:3085 00:08:39.132 [2024-11-28 12:39:09.065138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.132 [2024-11-28 12:39:09.065218] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:868077676010408972 len:3085 00:08:39.132 [2024-11-28 12:39:09.065238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.132 [2024-11-28 12:39:09.065304] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:868082074056920076 len:3085 00:08:39.132 [2024-11-28 12:39:09.065320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.132 [2024-11-28 12:39:09.065410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:868082074056920076 len:3085 00:08:39.132 [2024-11-28 12:39:09.065427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.132 #62 NEW cov: 12527 ft: 15140 corp: 28/667b lim: 50 exec/s: 62 rss: 75Mb L: 47/50 MS: 1 ChangeBinInt- 00:08:39.132 [2024-11-28 12:39:09.114889] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900871474942 len:65279 00:08:39.132 [2024-11-28 12:39:09.114919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.132 [2024-11-28 12:39:09.114983] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18374403900871474771 len:15871 00:08:39.132 [2024-11-28 12:39:09.115000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.132 #63 NEW cov: 12527 ft: 15148 corp: 29/690b lim: 50 exec/s: 63 rss: 75Mb L: 23/50 MS: 1 ChangeByte- 00:08:39.132 [2024-11-28 12:39:09.165509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18374403900871474942 len:65279 00:08:39.132 [2024-11-28 12:39:09.165540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.132 [2024-11-28 12:39:09.165608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18374403900871474942 len:65279 00:08:39.132 [2024-11-28 12:39:09.165629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.132 [2024-11-28 12:39:09.165683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:6052555320385797886 len:65279 00:08:39.132 [2024-11-28 12:39:09.165702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.132 [2024-11-28 12:39:09.165800] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:6052555320385797886 len:65279 00:08:39.132 [2024-11-28 12:39:09.165817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.132 #64 pulse cov: 12527 ft: 15163 corp: 29/690b lim: 50 exec/s: 32 rss: 75Mb 00:08:39.132 #64 NEW cov: 12527 ft: 15163 corp: 30/734b lim: 50 exec/s: 32 rss: 75Mb L: 44/50 MS: 1 CopyPart- 00:08:39.132 #64 DONE cov: 12527 ft: 15163 corp: 30/734b lim: 50 exec/s: 32 rss: 75Mb 00:08:39.132 Done 64 runs in 2 second(s) 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:39.393 12:39:09 
llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 20 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4420 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:39.393 12:39:09 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:39.393 [2024-11-28 12:39:09.368226] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:39.393 [2024-11-28 12:39:09.368299] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid613727 ] 00:08:39.652 [2024-11-28 12:39:09.687884] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:39.652 [2024-11-28 12:39:09.735513] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.652 [2024-11-28 12:39:09.756008] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.912 [2024-11-28 12:39:09.808919] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:39.912 [2024-11-28 12:39:09.825028] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:39.912 INFO: Running with entropic power schedule (0xFF, 100). 00:08:39.912 INFO: Seed: 2838286505 00:08:39.912 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:39.912 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:39.912 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:39.912 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.912 #2 INITED exec/s: 0 rss: 67Mb 00:08:39.912 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:39.912 This may also happen if the target rejected all inputs we tried so far 00:08:39.912 [2024-11-28 12:39:09.870613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:39.912 [2024-11-28 12:39:09.870645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.912 [2024-11-28 12:39:09.870681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:39.912 [2024-11-28 12:39:09.870696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.912 [2024-11-28 12:39:09.870756] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:39.913 [2024-11-28 12:39:09.870771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.173 NEW_FUNC[1/718]: 0x481038 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:40.173 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.173 #5 NEW cov: 12358 ft: 12357 corp: 2/71b lim: 90 exec/s: 0 rss: 74Mb L: 70/70 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:40.173 [2024-11-28 12:39:10.220372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.173 [2024-11-28 12:39:10.220416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.173 #7 NEW cov: 12471 ft: 13815 corp: 3/105b lim: 90 exec/s: 0 rss: 74Mb L: 34/70 MS: 2 ChangeByte-CrossOver- 00:08:40.173 [2024-11-28 12:39:10.270239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.173 [2024-11-28 12:39:10.270269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.432 #8 NEW cov: 12477 ft: 14058 corp: 4/139b lim: 90 exec/s: 0 rss: 74Mb L: 34/70 MS: 1 ChangeBit- 00:08:40.432 [2024-11-28 12:39:10.330286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.432 [2024-11-28 12:39:10.330314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.432 #9 NEW cov: 12562 ft: 14352 corp: 5/173b lim: 90 exec/s: 0 rss: 74Mb L: 34/70 MS: 1 ChangeBit- 00:08:40.432 [2024-11-28 12:39:10.390302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.432 [2024-11-28 12:39:10.390329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.432 #10 NEW cov: 12562 ft: 14439 corp: 6/207b lim: 90 exec/s: 0 rss: 74Mb L: 34/70 MS: 1 ShuffleBytes- 00:08:40.432 [2024-11-28 12:39:10.430349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.432 [2024-11-28 12:39:10.430378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:40.432 #11 NEW cov: 12562 ft: 14564 corp: 7/242b lim: 90 exec/s: 0 rss: 74Mb L: 35/70 MS: 1 InsertByte- 00:08:40.432 [2024-11-28 12:39:10.470766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.432 [2024-11-28 12:39:10.470794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.432 [2024-11-28 12:39:10.470858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:40.432 [2024-11-28 12:39:10.470874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.432 [2024-11-28 12:39:10.470926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:40.432 [2024-11-28 12:39:10.470941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.432 [2024-11-28 12:39:10.470994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:40.433 [2024-11-28 12:39:10.471009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.433 #12 NEW cov: 12562 ft: 14965 corp: 8/323b lim: 90 exec/s: 0 rss: 74Mb L: 81/81 MS: 1 CrossOver- 00:08:40.433 [2024-11-28 12:39:10.530346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.433 [2024-11-28 12:39:10.530373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.692 #13 NEW cov: 12562 ft: 15017 corp: 9/358b lim: 90 exec/s: 0 rss: 75Mb L: 35/81 MS: 1 ChangeByte- 00:08:40.692 [2024-11-28 12:39:10.590398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.692 [2024-11-28 12:39:10.590428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.692 #19 NEW cov: 12562 ft: 15035 corp: 10/382b lim: 90 exec/s: 0 rss: 75Mb L: 24/81 MS: 1 CrossOver- 00:08:40.692 [2024-11-28 12:39:10.650460] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.692 [2024-11-28 12:39:10.650495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.692 #20 NEW cov: 12562 ft: 15199 corp: 11/407b lim: 90 exec/s: 0 rss: 75Mb L: 25/81 MS: 1 InsertByte- 00:08:40.692 [2024-11-28 12:39:10.710447] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.692 [2024-11-28 12:39:10.710483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.692 #21 NEW cov: 12562 ft: 15243 corp: 12/442b lim: 90 exec/s: 0 rss: 75Mb L: 35/81 MS: 1 InsertByte- 00:08:40.692 [2024-11-28 12:39:10.750777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.692 [2024-11-28 12:39:10.750804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:40.692 [2024-11-28 12:39:10.750841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:40.692 [2024-11-28 12:39:10.750857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.692 [2024-11-28 12:39:10.750909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:40.692 [2024-11-28 12:39:10.750941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.692 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:40.692 #22 NEW cov: 12585 ft: 15327 corp: 13/497b lim: 90 exec/s: 0 rss: 75Mb L: 55/81 MS: 1 CrossOver- 00:08:40.692 [2024-11-28 12:39:10.790500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.692 [2024-11-28 12:39:10.790526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.952 #23 NEW cov: 12585 ft: 15402 corp: 14/524b lim: 90 exec/s: 0 rss: 75Mb L: 27/81 MS: 1 EraseBytes- 00:08:40.952 [2024-11-28 12:39:10.850813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.952 [2024-11-28 12:39:10.850841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.952 [2024-11-28 12:39:10.850882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:40.952 [2024-11-28 12:39:10.850897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.952 [2024-11-28 12:39:10.850951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:40.952 [2024-11-28 12:39:10.850965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.952 #24 NEW cov: 12585 ft: 15427 corp: 15/579b lim: 90 exec/s: 24 rss: 75Mb L: 55/81 MS: 1 CopyPart- 00:08:40.952 [2024-11-28 12:39:10.910512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.952 [2024-11-28 12:39:10.910539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.952 #25 NEW cov: 12585 ft: 15445 corp: 16/614b lim: 90 exec/s: 25 rss: 75Mb L: 35/81 MS: 1 InsertByte- 00:08:40.952 [2024-11-28 12:39:10.950530] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.952 [2024-11-28 12:39:10.950556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.952 #26 NEW cov: 12585 ft: 15490 corp: 17/648b lim: 90 exec/s: 26 rss: 75Mb L: 34/81 MS: 1 ShuffleBytes- 00:08:40.952 [2024-11-28 12:39:10.990577] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.952 [2024-11-28 12:39:10.990605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.952 #27 NEW cov: 12585 ft: 15518 corp: 18/683b lim: 90 exec/s: 27 rss: 75Mb L: 35/81 MS: 1 CMP- DE: "\001\000\000\000"- 00:08:40.952 [2024-11-28 12:39:11.050580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:40.952 [2024-11-28 12:39:11.050606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.952 #28 NEW cov: 12585 ft: 15519 corp: 19/702b lim: 90 exec/s: 28 rss: 75Mb L: 19/81 MS: 1 EraseBytes- 00:08:41.211 [2024-11-28 12:39:11.090654] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.211 [2024-11-28 12:39:11.090682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.211 #29 NEW cov: 12585 ft: 15547 corp: 20/727b lim: 90 exec/s: 29 rss: 75Mb L: 25/81 MS: 1 CopyPart- 00:08:41.211 [2024-11-28 12:39:11.150667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.211 [2024-11-28 12:39:11.150692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.211 #30 NEW cov: 12585 ft: 15574 corp: 21/752b lim: 90 exec/s: 30 rss: 75Mb L: 25/81 MS: 1 ChangeBit- 00:08:41.211 [2024-11-28 12:39:11.190646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.211 [2024-11-28 12:39:11.190672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.211 #31 NEW cov: 12585 ft: 15606 corp: 22/777b lim: 90 exec/s: 31 rss: 75Mb L: 25/81 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:08:41.211 [2024-11-28 12:39:11.230705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.211 [2024-11-28 12:39:11.230732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.211 #32 NEW cov: 12585 ft: 15632 corp: 23/812b lim: 90 exec/s: 32 rss: 75Mb L: 35/81 MS: 1 ChangeByte- 00:08:41.211 [2024-11-28 12:39:11.270711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.211 [2024-11-28 12:39:11.270738] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.211 #33 NEW cov: 12585 ft: 15648 corp: 24/847b lim: 90 exec/s: 33 rss: 75Mb L: 35/81 MS: 1 ChangeBit- 00:08:41.211 [2024-11-28 12:39:11.310721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.211 [2024-11-28 12:39:11.310748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.471 #34 NEW cov: 12585 ft: 15673 corp: 25/872b lim: 90 exec/s: 34 rss: 75Mb L: 25/81 MS: 1 ChangeByte- 00:08:41.471 [2024-11-28 12:39:11.370792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.471 [2024-11-28 12:39:11.370820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 
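
A note on the completion prints above: each spdk_nvme_print_completion notice shows the queue ID (qid), command ID (cid), completion dword 0 (cdw0), submission queue head (sqhd), and the phase (p), more (m), and do-not-retry (dnr) status bits; the "(00/0b)" pair is status code type / status code, here generic command status 0x0b, INVALID NAMESPACE OR FORMAT. A minimal bash sketch of the decode, assuming a raw 16-bit status word as input (the function name and sample value are illustrative, not taken from SPDK):

decode_nvme_status() {
  # Bit layout per the NVMe base spec (CQE dword 3, bits 31:16 viewed as a
  # 16-bit word): [0]=P, [8:1]=SC, [11:9]=SCT, [13:12]=CRD, [14]=M, [15]=DNR.
  local status=$(( $1 ))
  local p=$((    status        & 0x1  ))
  local sc=$((  (status >> 1)  & 0xff ))
  local sct=$(( (status >> 9)  & 0x7  ))
  local m=$((   (status >> 14) & 0x1  ))
  local dnr=$(( (status >> 15) & 0x1  ))
  printf '(%02x/%02x) p:%d m:%d dnr:%d\n' "$sct" "$sc" "$p" "$m" "$dnr"
}

decode_nvme_status 0x8016   # prints "(00/0b) p:0 m:0 dnr:1", matching the lines above
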
00:08:41.471 #35 NEW cov: 12585 ft: 15686 corp: 26/892b lim: 90 exec/s: 35 rss: 75Mb L: 20/81 MS: 1 EraseBytes- 00:08:41.471 [2024-11-28 12:39:11.410792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.471 [2024-11-28 12:39:11.410819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.471 #36 NEW cov: 12585 ft: 15695 corp: 27/927b lim: 90 exec/s: 36 rss: 75Mb L: 35/81 MS: 1 ChangeBinInt- 00:08:41.471 [2024-11-28 12:39:11.470804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.471 [2024-11-28 12:39:11.470831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.471 #37 NEW cov: 12585 ft: 15736 corp: 28/961b lim: 90 exec/s: 37 rss: 75Mb L: 34/81 MS: 1 ChangeByte- 00:08:41.471 [2024-11-28 12:39:11.511110] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.471 [2024-11-28 12:39:11.511136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.471 [2024-11-28 12:39:11.511184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.471 [2024-11-28 12:39:11.511200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.471 [2024-11-28 12:39:11.511252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:41.471 [2024-11-28 12:39:11.511268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.471 #38 NEW cov: 12585 ft: 15743 corp: 29/1022b lim: 90 exec/s: 38 rss: 75Mb L: 61/81 MS: 1 InsertRepeatedBytes- 00:08:41.471 [2024-11-28 12:39:11.570881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.471 [2024-11-28 12:39:11.570908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.730 #39 NEW cov: 12585 ft: 15767 corp: 30/1056b lim: 90 exec/s: 39 rss: 75Mb L: 34/81 MS: 1 ChangeBinInt- 00:08:41.730 [2024-11-28 12:39:11.611300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.730 [2024-11-28 12:39:11.611327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.730 [2024-11-28 12:39:11.611370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.730 [2024-11-28 12:39:11.611386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.730 [2024-11-28 12:39:11.611440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:41.730 [2024-11-28 12:39:11.611456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.730 [2024-11-28 12:39:11.611530] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:41.730 [2024-11-28 12:39:11.611547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.730 #40 NEW cov: 12585 ft: 15848 corp: 31/1139b lim: 90 exec/s: 40 rss: 75Mb L: 83/83 MS: 1 CrossOver- 00:08:41.730 [2024-11-28 12:39:11.650892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.730 [2024-11-28 12:39:11.650920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.730 #41 NEW cov: 12585 ft: 15851 corp: 32/1173b lim: 90 exec/s: 41 rss: 76Mb L: 34/83 MS: 1 ShuffleBytes- 00:08:41.730 [2024-11-28 12:39:11.710937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.731 [2024-11-28 12:39:11.710964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.731 #42 NEW cov: 12585 ft: 15873 corp: 33/1207b lim: 90 exec/s: 42 rss: 76Mb L: 34/83 MS: 1 ChangeByte- 00:08:41.731 [2024-11-28 12:39:11.771286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.731 [2024-11-28 12:39:11.771314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.731 [2024-11-28 12:39:11.771368] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.731 [2024-11-28 12:39:11.771385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.731 [2024-11-28 12:39:11.771440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:41.731 [2024-11-28 12:39:11.771456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.731 #43 NEW cov: 12585 ft: 15880 corp: 34/1262b lim: 90 exec/s: 43 rss: 76Mb L: 55/83 MS: 1 CrossOver- 00:08:41.731 [2024-11-28 12:39:11.811109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.731 [2024-11-28 12:39:11.811137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.731 [2024-11-28 12:39:11.811192] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:41.731 [2024-11-28 12:39:11.811209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.731 #44 NEW cov: 12585 ft: 16179 corp: 35/1298b lim: 90 exec/s: 44 rss: 76Mb L: 36/83 MS: 1 CopyPart- 00:08:41.731 [2024-11-28 12:39:11.851010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:41.731 [2024-11-28 12:39:11.851037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.990 #45 NEW cov: 12585 ft: 16193 corp: 36/1332b lim: 90 exec/s: 22 rss: 76Mb L: 34/83 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:41.990 #45 DONE cov: 12585 ft: 16193 
corp: 36/1332b lim: 90 exec/s: 22 rss: 76Mb 00:08:41.990 ###### Recommended dictionary. ###### 00:08:41.990 "\001\000\000\000" # Uses: 1 00:08:41.990 "\000\000\000\000\000\000\000\000" # Uses: 0 00:08:41.990 ###### End of recommended dictionary. ###### 00:08:41.990 Done 45 runs in 2 second(s) 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 21 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4421 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:41.990 12:39:11 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:41.990 [2024-11-28 12:39:12.022587] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:41.991 [2024-11-28 12:39:12.022661] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid614069 ] 00:08:42.249 [2024-11-28 12:39:12.341739] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
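
A condensed sketch of the per-instance setup that the nvmf/run.sh traces above walk through: each fuzzer gets its own TCP port (44 followed by the zero-padded fuzzer number), its own JSON target config produced by the sed substitution, an LSAN suppression file for two known harness leaks, and a private corpus directory. This is a reconstruction from the traced commands, not a verbatim copy of nvmf/run.sh; paths are shortened via a hypothetical $spdk_dir, and the redirect of sed's output into the config file is assumed (the trace does not show it explicitly):

fuzzer_type=20
timen=1
core=0x1
corpus_dir=$spdk_dir/../corpus/llvm_nvmf_${fuzzer_type}
nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
suppress_file=/var/tmp/suppress_nvmf_fuzz
port="44$(printf %02d "$fuzzer_type")"          # 4420 for type 20, 4421 for 21, ...
mkdir -p "$corpus_dir"
trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
# Point the target at this instance's port (redirect into $nvmf_cfg assumed)
sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
    "$spdk_dir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
# Suppress two leaks the harness triggers intentionally
{ echo "leak:spdk_nvmf_qpair_disconnect"; echo "leak:nvmf_ctrlr_create"; } > "$suppress_file"
export LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0"
"$spdk_dir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
    -P "$spdk_dir/../output/llvm/" \
    -F "$trid" -c "$nvmf_cfg" -t "$timen" -D "$corpus_dir" \
    -Z "$fuzzer_type"                           # -Z as in the traced command
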
00:08:42.507 [2024-11-28 12:39:12.389867] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.507 [2024-11-28 12:39:12.410957] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.507 [2024-11-28 12:39:12.463488] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:42.507 [2024-11-28 12:39:12.479593] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:42.507 INFO: Running with entropic power schedule (0xFF, 100). 00:08:42.507 INFO: Seed: 1199336333 00:08:42.507 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:42.507 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:42.507 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:42.507 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.507 #2 INITED exec/s: 0 rss: 66Mb 00:08:42.507 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:42.507 This may also happen if the target rejected all inputs we tried so far 00:08:42.507 [2024-11-28 12:39:12.524510] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:42.507 [2024-11-28 12:39:12.524545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.507 [2024-11-28 12:39:12.524580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:42.507 [2024-11-28 12:39:12.524599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.507 [2024-11-28 12:39:12.524630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:42.507 [2024-11-28 12:39:12.524649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.767 NEW_FUNC[1/718]: 0x484268 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:42.767 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:42.767 #11 NEW cov: 12326 ft: 12332 corp: 2/35b lim: 50 exec/s: 0 rss: 73Mb L: 34/34 MS: 4 CrossOver-InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:42.767 [2024-11-28 12:39:12.884535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:42.767 [2024-11-28 12:39:12.884578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.767 [2024-11-28 12:39:12.884629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:42.767 [2024-11-28 12:39:12.884647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.767 [2024-11-28 12:39:12.884677] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:42.767 [2024-11-28 12:39:12.884693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:43.026 #12 NEW cov: 12446 ft: 12917 corp: 3/69b lim: 50 exec/s: 0 rss: 73Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:43.026 [2024-11-28 12:39:12.974409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.026 [2024-11-28 12:39:12.974442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.026 [2024-11-28 12:39:12.974498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.026 [2024-11-28 12:39:12.974516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.026 [2024-11-28 12:39:12.974547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.026 [2024-11-28 12:39:12.974563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.026 #13 NEW cov: 12452 ft: 13098 corp: 4/104b lim: 50 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 InsertByte- 00:08:43.026 [2024-11-28 12:39:13.064430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.026 [2024-11-28 12:39:13.064460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.026 [2024-11-28 12:39:13.064516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.026 [2024-11-28 12:39:13.064534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.026 [2024-11-28 12:39:13.064565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.026 [2024-11-28 12:39:13.064582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.026 #14 NEW cov: 12537 ft: 13471 corp: 5/139b lim: 50 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 ChangeByte- 00:08:43.285 [2024-11-28 12:39:13.154527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.285 [2024-11-28 12:39:13.154559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.285 [2024-11-28 12:39:13.154594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.285 [2024-11-28 12:39:13.154612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.285 [2024-11-28 12:39:13.154644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.285 [2024-11-28 12:39:13.154661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.285 #15 NEW cov: 12537 ft: 13585 corp: 6/174b lim: 50 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 CrossOver- 00:08:43.285 [2024-11-28 12:39:13.244518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.285 [2024-11-28 12:39:13.244550] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.285 [2024-11-28 12:39:13.244599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.286 [2024-11-28 12:39:13.244627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.286 [2024-11-28 12:39:13.244658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.286 [2024-11-28 12:39:13.244675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.286 #21 NEW cov: 12537 ft: 13709 corp: 7/209b lim: 50 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 ChangeByte- 00:08:43.286 [2024-11-28 12:39:13.334515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.286 [2024-11-28 12:39:13.334547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.286 [2024-11-28 12:39:13.334580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.286 [2024-11-28 12:39:13.334598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.286 [2024-11-28 12:39:13.334629] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.286 [2024-11-28 12:39:13.334646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.286 #22 NEW cov: 12537 ft: 13869 corp: 8/244b lim: 50 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 InsertByte- 00:08:43.286 [2024-11-28 12:39:13.394564] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.286 [2024-11-28 12:39:13.394596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.286 [2024-11-28 12:39:13.394630] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.286 [2024-11-28 12:39:13.394648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.286 [2024-11-28 12:39:13.394680] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.286 [2024-11-28 12:39:13.394697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.546 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:43.546 #23 NEW cov: 12560 ft: 14048 corp: 9/279b lim: 50 exec/s: 0 rss: 74Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:43.546 [2024-11-28 12:39:13.484555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.546 [2024-11-28 12:39:13.484585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.546 [2024-11-28 12:39:13.484618] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.546 [2024-11-28 12:39:13.484636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.546 [2024-11-28 12:39:13.484667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.546 [2024-11-28 12:39:13.484683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.546 #24 NEW cov: 12560 ft: 14083 corp: 10/317b lim: 50 exec/s: 24 rss: 74Mb L: 38/38 MS: 1 InsertRepeatedBytes- 00:08:43.546 [2024-11-28 12:39:13.574668] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.546 [2024-11-28 12:39:13.574699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.546 [2024-11-28 12:39:13.574747] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.546 [2024-11-28 12:39:13.574764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.546 [2024-11-28 12:39:13.574795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.546 [2024-11-28 12:39:13.574812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.546 [2024-11-28 12:39:13.574841] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:43.546 [2024-11-28 12:39:13.574857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.546 #25 NEW cov: 12560 ft: 14450 corp: 11/363b lim: 50 exec/s: 25 rss: 74Mb L: 46/46 MS: 1 InsertRepeatedBytes- 00:08:43.546 [2024-11-28 12:39:13.634553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.546 [2024-11-28 12:39:13.634592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.546 [2024-11-28 12:39:13.634643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.546 [2024-11-28 12:39:13.634661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.546 [2024-11-28 12:39:13.634692] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.546 [2024-11-28 12:39:13.634710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.806 #26 NEW cov: 12560 ft: 14469 corp: 12/398b lim: 50 exec/s: 26 rss: 74Mb L: 35/46 MS: 1 ShuffleBytes- 00:08:43.806 [2024-11-28 12:39:13.694591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.806 [2024-11-28 12:39:13.694622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.694654] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.806 [2024-11-28 12:39:13.694672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.694703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.806 [2024-11-28 12:39:13.694724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.806 #27 NEW cov: 12560 ft: 14492 corp: 13/433b lim: 50 exec/s: 27 rss: 74Mb L: 35/46 MS: 1 ChangeBit- 00:08:43.806 [2024-11-28 12:39:13.744526] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.806 [2024-11-28 12:39:13.744557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.744591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.806 [2024-11-28 12:39:13.744608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.806 #28 NEW cov: 12560 ft: 14836 corp: 14/456b lim: 50 exec/s: 28 rss: 74Mb L: 23/46 MS: 1 EraseBytes- 00:08:43.806 [2024-11-28 12:39:13.834696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.806 [2024-11-28 12:39:13.834726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.834759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.806 [2024-11-28 12:39:13.834776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.834807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.806 [2024-11-28 12:39:13.834823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.834853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:43.806 [2024-11-28 12:39:13.834869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.806 #34 NEW cov: 12560 ft: 14860 corp: 15/496b lim: 50 exec/s: 34 rss: 74Mb L: 40/46 MS: 1 InsertRepeatedBytes- 00:08:43.806 [2024-11-28 12:39:13.894659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:43.806 [2024-11-28 12:39:13.894690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.894722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:43.806 [2024-11-28 12:39:13.894740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.894770] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:43.806 [2024-11-28 12:39:13.894786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.806 [2024-11-28 12:39:13.894815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:43.806 [2024-11-28 12:39:13.894831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.065 #35 NEW cov: 12560 ft: 14874 corp: 16/543b lim: 50 exec/s: 35 rss: 74Mb L: 47/47 MS: 1 InsertRepeatedBytes- 00:08:44.065 [2024-11-28 12:39:13.984682] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.065 [2024-11-28 12:39:13.984712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.065 [2024-11-28 12:39:13.984761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.065 [2024-11-28 12:39:13.984779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.065 [2024-11-28 12:39:13.984816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.065 [2024-11-28 12:39:13.984833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.065 #36 NEW cov: 12560 ft: 14915 corp: 17/578b lim: 50 exec/s: 36 rss: 74Mb L: 35/47 MS: 1 ChangeBit- 00:08:44.065 [2024-11-28 12:39:14.044650] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.065 [2024-11-28 12:39:14.044681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.065 [2024-11-28 12:39:14.044715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.065 [2024-11-28 12:39:14.044733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.065 [2024-11-28 12:39:14.044764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.065 [2024-11-28 12:39:14.044781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.065 #37 NEW cov: 12560 ft: 14943 corp: 18/616b lim: 50 exec/s: 37 rss: 74Mb L: 38/47 MS: 1 CopyPart- 00:08:44.065 [2024-11-28 12:39:14.134693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.065 [2024-11-28 12:39:14.134725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.065 [2024-11-28 12:39:14.134761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.065 [2024-11-28 12:39:14.134779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.065 #38 NEW cov: 12560 ft: 14989 
corp: 19/638b lim: 50 exec/s: 38 rss: 74Mb L: 22/47 MS: 1 EraseBytes- 00:08:44.325 [2024-11-28 12:39:14.194754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.325 [2024-11-28 12:39:14.194786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.194820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.325 [2024-11-28 12:39:14.194838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.194870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.325 [2024-11-28 12:39:14.194887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.325 #39 NEW cov: 12560 ft: 15044 corp: 20/672b lim: 50 exec/s: 39 rss: 74Mb L: 34/47 MS: 1 ShuffleBytes- 00:08:44.325 [2024-11-28 12:39:14.254799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.325 [2024-11-28 12:39:14.254831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.254880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.325 [2024-11-28 12:39:14.254898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.254929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.325 [2024-11-28 12:39:14.254946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.254976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:44.325 [2024-11-28 12:39:14.254997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.325 #40 NEW cov: 12560 ft: 15059 corp: 21/719b lim: 50 exec/s: 40 rss: 74Mb L: 47/47 MS: 1 CopyPart- 00:08:44.325 [2024-11-28 12:39:14.314652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.325 [2024-11-28 12:39:14.314685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.325 #41 NEW cov: 12560 ft: 15774 corp: 22/736b lim: 50 exec/s: 41 rss: 74Mb L: 17/47 MS: 1 EraseBytes- 00:08:44.325 [2024-11-28 12:39:14.374729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.325 [2024-11-28 12:39:14.374760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.374809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.325 [2024-11-28 12:39:14.374827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.374858] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.325 [2024-11-28 12:39:14.374875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.325 #42 NEW cov: 12560 ft: 15832 corp: 23/770b lim: 50 exec/s: 42 rss: 74Mb L: 34/47 MS: 1 CrossOver- 00:08:44.325 [2024-11-28 12:39:14.424791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.325 [2024-11-28 12:39:14.424820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.424867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.325 [2024-11-28 12:39:14.424886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.424917] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.325 [2024-11-28 12:39:14.424933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.325 [2024-11-28 12:39:14.424962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:44.325 [2024-11-28 12:39:14.424978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:44.584 #43 NEW cov: 12560 ft: 15859 corp: 24/814b lim: 50 exec/s: 43 rss: 74Mb L: 44/47 MS: 1 InsertRepeatedBytes- 00:08:44.584 [2024-11-28 12:39:14.484826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:44.584 [2024-11-28 12:39:14.484858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.584 [2024-11-28 12:39:14.484893] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:44.584 [2024-11-28 12:39:14.484911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.584 [2024-11-28 12:39:14.484943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:44.584 [2024-11-28 12:39:14.484960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.584 #44 NEW cov: 12560 ft: 15871 corp: 25/848b lim: 50 exec/s: 22 rss: 74Mb L: 34/47 MS: 1 ChangeBit- 00:08:44.584 #44 DONE cov: 12560 ft: 15871 corp: 25/848b lim: 50 exec/s: 22 rss: 74Mb 00:08:44.584 Done 44 runs in 2 second(s) 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- 
nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 22 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4422 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:44.584 12:39:14 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:44.584 [2024-11-28 12:39:14.675665] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:44.584 [2024-11-28 12:39:14.675740] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid614411 ] 00:08:45.152 [2024-11-28 12:39:15.003255] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:45.152 [2024-11-28 12:39:15.050105] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.152 [2024-11-28 12:39:15.070489] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.152 [2024-11-28 12:39:15.123243] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.152 [2024-11-28 12:39:15.139354] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:45.152 INFO: Running with entropic power schedule (0xFF, 100). 
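
The "#N NEW cov: ..." lines interleaved with the qpair prints are standard libFuzzer status output: in libFuzzer's usual notation, cov and ft are coverage points and features seen so far, corp is corpus size as inputs/bytes, lim is the current input-length cap, exec/s and rss are throughput and memory, L: a/b is the new input's length versus the largest in the corpus, and MS lists the mutations that produced it. Two illustrative one-liners for pulling these numbers out of a console log like this one (the filename is hypothetical):

grep -oE '#[0-9]+ DONE cov: [0-9]+ ft: [0-9]+ corp: [^ ]+ lim: [0-9]+ exec/s: [0-9]+' console.log
awk '/ NEW cov: / { for (i = 1; i <= NF; i++) if ($i == "cov:") print $(i + 1) }' console.log   # coverage growth per NEW event
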
00:08:45.152 INFO: Seed: 3858339253 00:08:45.152 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:45.152 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:45.152 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:45.152 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.152 #2 INITED exec/s: 0 rss: 65Mb 00:08:45.152 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:45.152 This may also happen if the target rejected all inputs we tried so far 00:08:45.152 [2024-11-28 12:39:15.194992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.152 [2024-11-28 12:39:15.195023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.152 [2024-11-28 12:39:15.195069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.152 [2024-11-28 12:39:15.195086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.152 [2024-11-28 12:39:15.195145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.152 [2024-11-28 12:39:15.195161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.411 NEW_FUNC[1/718]: 0x486538 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:45.411 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:45.411 #12 NEW cov: 12359 ft: 12358 corp: 2/67b lim: 85 exec/s: 0 rss: 73Mb L: 66/66 MS: 5 ShuffleBytes-ChangeByte-CopyPart-ChangeByte-InsertRepeatedBytes- 00:08:45.411 [2024-11-28 12:39:15.515037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.411 [2024-11-28 12:39:15.515080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.411 [2024-11-28 12:39:15.515145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.411 [2024-11-28 12:39:15.515164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.411 [2024-11-28 12:39:15.515225] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.411 [2024-11-28 12:39:15.515243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.671 #18 NEW cov: 12472 ft: 12959 corp: 3/133b lim: 85 exec/s: 0 rss: 73Mb L: 66/66 MS: 1 CopyPart- 00:08:45.671 [2024-11-28 12:39:15.575132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.671 [2024-11-28 12:39:15.575161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.575204] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.671 [2024-11-28 12:39:15.575219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.575274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.671 [2024-11-28 12:39:15.575290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.575345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:45.671 [2024-11-28 12:39:15.575360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.671 #19 NEW cov: 12478 ft: 13591 corp: 4/203b lim: 85 exec/s: 0 rss: 73Mb L: 70/70 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:45.671 [2024-11-28 12:39:15.635257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.671 [2024-11-28 12:39:15.635286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.635344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.671 [2024-11-28 12:39:15.635358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.635414] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.671 [2024-11-28 12:39:15.635434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.671 #20 NEW cov: 12572 ft: 14026 corp: 5/269b lim: 85 exec/s: 0 rss: 73Mb L: 66/70 MS: 1 ShuffleBytes- 00:08:45.671 [2024-11-28 12:39:15.675155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.671 [2024-11-28 12:39:15.675182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.675236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.671 [2024-11-28 12:39:15.675252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.675308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.671 [2024-11-28 12:39:15.675324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.675380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:45.671 [2024-11-28 12:39:15.675396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.671 #21 NEW cov: 12572 ft: 14134 corp: 6/339b lim: 85 exec/s: 0 rss: 73Mb L: 70/70 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:45.671 
[2024-11-28 12:39:15.715205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.671 [2024-11-28 12:39:15.715233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.715303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.671 [2024-11-28 12:39:15.715319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.715376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.671 [2024-11-28 12:39:15.715391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.715448] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:45.671 [2024-11-28 12:39:15.715464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.671 #22 NEW cov: 12572 ft: 14211 corp: 7/413b lim: 85 exec/s: 0 rss: 73Mb L: 74/74 MS: 1 InsertRepeatedBytes- 00:08:45.671 [2024-11-28 12:39:15.755153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.671 [2024-11-28 12:39:15.755179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.755252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.671 [2024-11-28 12:39:15.755269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.755325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.671 [2024-11-28 12:39:15.755341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.671 [2024-11-28 12:39:15.755396] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:45.671 [2024-11-28 12:39:15.755412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.932 #23 NEW cov: 12572 ft: 14315 corp: 8/483b lim: 85 exec/s: 0 rss: 73Mb L: 70/74 MS: 1 ShuffleBytes- 00:08:45.932 [2024-11-28 12:39:15.815179] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.932 [2024-11-28 12:39:15.815208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.815257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.932 [2024-11-28 12:39:15.815273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.815330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.932 [2024-11-28 12:39:15.815347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.815411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:45.932 [2024-11-28 12:39:15.815432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.932 #24 NEW cov: 12572 ft: 14421 corp: 9/557b lim: 85 exec/s: 0 rss: 73Mb L: 74/74 MS: 1 CopyPart- 00:08:45.932 [2024-11-28 12:39:15.855031] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.932 [2024-11-28 12:39:15.855059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.855107] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.932 [2024-11-28 12:39:15.855123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.855180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.932 [2024-11-28 12:39:15.855195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.932 #25 NEW cov: 12572 ft: 14473 corp: 10/618b lim: 85 exec/s: 0 rss: 73Mb L: 61/74 MS: 1 EraseBytes- 00:08:45.932 [2024-11-28 12:39:15.915054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.932 [2024-11-28 12:39:15.915081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.915128] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.932 [2024-11-28 12:39:15.915144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.915200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.932 [2024-11-28 12:39:15.915215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.932 #26 NEW cov: 12572 ft: 14589 corp: 11/684b lim: 85 exec/s: 0 rss: 73Mb L: 66/74 MS: 1 ChangeBit- 00:08:45.932 [2024-11-28 12:39:15.955267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.932 [2024-11-28 12:39:15.955294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.955358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.932 [2024-11-28 12:39:15.955374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.955430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.932 [2024-11-28 12:39:15.955447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:15.955505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:45.932 [2024-11-28 12:39:15.955521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.932 #27 NEW cov: 12572 ft: 14664 corp: 12/755b lim: 85 exec/s: 0 rss: 73Mb L: 71/74 MS: 1 InsertByte- 00:08:45.932 [2024-11-28 12:39:16.015286] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.932 [2024-11-28 12:39:16.015314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:16.015382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.932 [2024-11-28 12:39:16.015399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:16.015457] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.932 [2024-11-28 12:39:16.015477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:16.015537] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:45.932 [2024-11-28 12:39:16.015554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:45.932 #28 NEW cov: 12572 ft: 14706 corp: 13/825b lim: 85 exec/s: 0 rss: 73Mb L: 70/74 MS: 1 ChangeByte- 00:08:45.932 [2024-11-28 12:39:16.055369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:45.932 [2024-11-28 12:39:16.055396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:16.055443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:45.932 [2024-11-28 12:39:16.055457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:16.055517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:45.932 [2024-11-28 12:39:16.055533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.932 [2024-11-28 12:39:16.055590] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:45.932 [2024-11-28 12:39:16.055606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.192 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:46.192 #29 NEW cov: 12595 ft: 14794 corp: 14/905b lim: 85 exec/s: 0 rss: 74Mb L: 80/80 
MS: 1 InsertRepeatedBytes- 00:08:46.192 [2024-11-28 12:39:16.095494] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.192 [2024-11-28 12:39:16.095521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.192 [2024-11-28 12:39:16.095580] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.192 [2024-11-28 12:39:16.095596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.192 [2024-11-28 12:39:16.095667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.192 [2024-11-28 12:39:16.095682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.192 [2024-11-28 12:39:16.095741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.192 [2024-11-28 12:39:16.095758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.192 [2024-11-28 12:39:16.095822] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:46.192 [2024-11-28 12:39:16.095842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.192 #30 NEW cov: 12595 ft: 14896 corp: 15/990b lim: 85 exec/s: 0 rss: 74Mb L: 85/85 MS: 1 CrossOver- 00:08:46.192 [2024-11-28 12:39:16.135420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.192 [2024-11-28 12:39:16.135448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.192 [2024-11-28 12:39:16.135508] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.193 [2024-11-28 12:39:16.135531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.135587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.193 [2024-11-28 12:39:16.135603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.135660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.193 [2024-11-28 12:39:16.135677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.193 #31 NEW cov: 12595 ft: 14933 corp: 16/1070b lim: 85 exec/s: 31 rss: 74Mb L: 80/85 MS: 1 ChangeByte- 00:08:46.193 [2024-11-28 12:39:16.195423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.193 [2024-11-28 12:39:16.195451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.195509] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.193 [2024-11-28 12:39:16.195526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.195582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.193 [2024-11-28 12:39:16.195598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.195653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.193 [2024-11-28 12:39:16.195669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.193 #32 NEW cov: 12595 ft: 14941 corp: 17/1140b lim: 85 exec/s: 32 rss: 74Mb L: 70/85 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:46.193 [2024-11-28 12:39:16.255400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.193 [2024-11-28 12:39:16.255428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.255488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.193 [2024-11-28 12:39:16.255504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.255559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.193 [2024-11-28 12:39:16.255578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.255636] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.193 [2024-11-28 12:39:16.255652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.193 #33 NEW cov: 12595 ft: 14964 corp: 18/1214b lim: 85 exec/s: 33 rss: 74Mb L: 74/85 MS: 1 ShuffleBytes- 00:08:46.193 [2024-11-28 12:39:16.315475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.193 [2024-11-28 12:39:16.315503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.315562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.193 [2024-11-28 12:39:16.315577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.315634] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.193 [2024-11-28 12:39:16.315651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.193 [2024-11-28 12:39:16.315707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 
cid:3 nsid:0 00:08:46.193 [2024-11-28 12:39:16.315721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.453 #34 NEW cov: 12595 ft: 14979 corp: 19/1297b lim: 85 exec/s: 34 rss: 74Mb L: 83/85 MS: 1 CopyPart- 00:08:46.453 [2024-11-28 12:39:16.355411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.453 [2024-11-28 12:39:16.355437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.355499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.453 [2024-11-28 12:39:16.355516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.355587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.453 [2024-11-28 12:39:16.355603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.355660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.453 [2024-11-28 12:39:16.355676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.453 #35 NEW cov: 12595 ft: 15021 corp: 20/1381b lim: 85 exec/s: 35 rss: 74Mb L: 84/85 MS: 1 InsertByte- 00:08:46.453 [2024-11-28 12:39:16.415488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.453 [2024-11-28 12:39:16.415516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.415586] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.453 [2024-11-28 12:39:16.415602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.415659] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.453 [2024-11-28 12:39:16.415675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.415733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.453 [2024-11-28 12:39:16.415752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.453 #36 NEW cov: 12595 ft: 15030 corp: 21/1463b lim: 85 exec/s: 36 rss: 74Mb L: 82/85 MS: 1 CMP- DE: "\017\000\000\000\000\000\000\000"- 00:08:46.453 [2024-11-28 12:39:16.475491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.453 [2024-11-28 12:39:16.475518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.475589] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.453 [2024-11-28 12:39:16.475606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.475661] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.453 [2024-11-28 12:39:16.475677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.475734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.453 [2024-11-28 12:39:16.475750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.453 #39 NEW cov: 12595 ft: 15065 corp: 22/1545b lim: 85 exec/s: 39 rss: 74Mb L: 82/85 MS: 3 ChangeBit-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:46.453 [2024-11-28 12:39:16.515484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.453 [2024-11-28 12:39:16.515511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.515566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.453 [2024-11-28 12:39:16.515582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.453 [2024-11-28 12:39:16.515638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.453 [2024-11-28 12:39:16.515653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.454 [2024-11-28 12:39:16.515711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.454 [2024-11-28 12:39:16.515726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.454 #40 NEW cov: 12595 ft: 15115 corp: 23/1627b lim: 85 exec/s: 40 rss: 74Mb L: 82/85 MS: 1 PersAutoDict- DE: "\017\000\000\000\000\000\000\000"- 00:08:46.454 [2024-11-28 12:39:16.575404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.454 [2024-11-28 12:39:16.575430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.454 [2024-11-28 12:39:16.575492] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.454 [2024-11-28 12:39:16.575509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.454 [2024-11-28 12:39:16.575566] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.454 [2024-11-28 12:39:16.575582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.714 #41 NEW cov: 12595 ft: 15163 corp: 24/1688b lim: 85 exec/s: 41 rss: 74Mb L: 61/85 MS: 1 CopyPart- 
00:08:46.714 [2024-11-28 12:39:16.635394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.714 [2024-11-28 12:39:16.635424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.714 [2024-11-28 12:39:16.635463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.714 [2024-11-28 12:39:16.635483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.714 [2024-11-28 12:39:16.635539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.714 [2024-11-28 12:39:16.635554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.714 #42 NEW cov: 12595 ft: 15196 corp: 25/1754b lim: 85 exec/s: 42 rss: 74Mb L: 66/85 MS: 1 ChangeByte- 00:08:46.714 [2024-11-28 12:39:16.695104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.714 [2024-11-28 12:39:16.695131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.714 #43 NEW cov: 12595 ft: 16009 corp: 26/1784b lim: 85 exec/s: 43 rss: 74Mb L: 30/85 MS: 1 CrossOver- 00:08:46.714 [2024-11-28 12:39:16.755663] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.714 [2024-11-28 12:39:16.755691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.714 [2024-11-28 12:39:16.755740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.714 [2024-11-28 12:39:16.755755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.714 [2024-11-28 12:39:16.755808] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.714 [2024-11-28 12:39:16.755823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.714 [2024-11-28 12:39:16.755876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.714 [2024-11-28 12:39:16.755892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.714 #44 NEW cov: 12595 ft: 16019 corp: 27/1866b lim: 85 exec/s: 44 rss: 74Mb L: 82/85 MS: 1 ChangeBit- 00:08:46.714 [2024-11-28 12:39:16.815671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.714 [2024-11-28 12:39:16.815699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.714 [2024-11-28 12:39:16.815751] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.714 [2024-11-28 12:39:16.815767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 
dnr:1 00:08:46.714 [2024-11-28 12:39:16.815821] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.714 [2024-11-28 12:39:16.815837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.714 [2024-11-28 12:39:16.815892] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.714 [2024-11-28 12:39:16.815907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.714 #45 NEW cov: 12595 ft: 16049 corp: 28/1940b lim: 85 exec/s: 45 rss: 74Mb L: 74/85 MS: 1 CMP- DE: "\000\000\000\000\000\000\000?"- 00:08:46.974 [2024-11-28 12:39:16.855482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.974 [2024-11-28 12:39:16.855510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.855551] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.974 [2024-11-28 12:39:16.855568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.855623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.974 [2024-11-28 12:39:16.855640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.974 #46 NEW cov: 12595 ft: 16058 corp: 29/1993b lim: 85 exec/s: 46 rss: 74Mb L: 53/85 MS: 1 CrossOver- 00:08:46.974 [2024-11-28 12:39:16.915887] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.974 [2024-11-28 12:39:16.915913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.915985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.974 [2024-11-28 12:39:16.916001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.916056] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.974 [2024-11-28 12:39:16.916072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.916127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.974 [2024-11-28 12:39:16.916142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.916199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:46.974 [2024-11-28 12:39:16.916215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:46.974 #47 NEW cov: 12595 ft: 16075 corp: 30/2078b lim: 85 exec/s: 
47 rss: 74Mb L: 85/85 MS: 1 CopyPart- 00:08:46.974 [2024-11-28 12:39:16.955731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.974 [2024-11-28 12:39:16.955757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.955827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.974 [2024-11-28 12:39:16.955843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.955900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.974 [2024-11-28 12:39:16.955915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.955969] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.974 [2024-11-28 12:39:16.955984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.974 #48 NEW cov: 12595 ft: 16083 corp: 31/2148b lim: 85 exec/s: 48 rss: 75Mb L: 70/85 MS: 1 ShuffleBytes- 00:08:46.974 [2024-11-28 12:39:16.995603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.974 [2024-11-28 12:39:16.995629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.995667] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.974 [2024-11-28 12:39:16.995685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:16.995740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.974 [2024-11-28 12:39:16.995755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.974 #49 NEW cov: 12595 ft: 16088 corp: 32/2215b lim: 85 exec/s: 49 rss: 75Mb L: 67/85 MS: 1 InsertByte- 00:08:46.974 [2024-11-28 12:39:17.035801] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.974 [2024-11-28 12:39:17.035828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:17.035900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.974 [2024-11-28 12:39:17.035916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:17.035972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.974 [2024-11-28 12:39:17.035987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:17.036043] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.974 [2024-11-28 12:39:17.036058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.974 #50 NEW cov: 12595 ft: 16103 corp: 33/2286b lim: 85 exec/s: 50 rss: 75Mb L: 71/85 MS: 1 InsertByte- 00:08:46.974 [2024-11-28 12:39:17.095882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:46.974 [2024-11-28 12:39:17.095909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:17.095966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:46.974 [2024-11-28 12:39:17.095982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:17.096038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:46.974 [2024-11-28 12:39:17.096053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.974 [2024-11-28 12:39:17.096108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:46.975 [2024-11-28 12:39:17.096124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.241 #51 NEW cov: 12595 ft: 16116 corp: 34/2368b lim: 85 exec/s: 51 rss: 75Mb L: 82/85 MS: 1 ChangeBit- 00:08:47.241 [2024-11-28 12:39:17.155929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.241 [2024-11-28 12:39:17.155957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.241 [2024-11-28 12:39:17.156023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.241 [2024-11-28 12:39:17.156041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.241 [2024-11-28 12:39:17.156096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:47.241 [2024-11-28 12:39:17.156112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.241 [2024-11-28 12:39:17.156167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:47.241 [2024-11-28 12:39:17.156188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.241 #52 NEW cov: 12595 ft: 16123 corp: 35/2448b lim: 85 exec/s: 26 rss: 75Mb L: 80/85 MS: 1 CMP- DE: "J\037\242|\024\276J\000"- 00:08:47.241 #52 DONE cov: 12595 ft: 16123 corp: 35/2448b lim: 85 exec/s: 26 rss: 75Mb 00:08:47.241 ###### Recommended dictionary. ###### 00:08:47.241 "\002\000\000\000" # Uses: 2 00:08:47.241 "\017\000\000\000\000\000\000\000" # Uses: 1 00:08:47.241 "\000\000\000\000\000\000\000?" 
# Uses: 0 00:08:47.241 "J\037\242|\024\276J\000" # Uses: 0 00:08:47.241 ###### End of recommended dictionary. ###### 00:08:47.241 Done 52 runs in 2 second(s) 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 23 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4423 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:47.241 12:39:17 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:47.241 [2024-11-28 12:39:17.315960] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:47.241 [2024-11-28 12:39:17.316022] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid614759 ] 00:08:47.502 [2024-11-28 12:39:17.557229] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:08:47.502 [2024-11-28 12:39:17.604651] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.502 [2024-11-28 12:39:17.620151] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.762 [2024-11-28 12:39:17.672954] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:47.762 [2024-11-28 12:39:17.689066] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:47.762 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.762 INFO: Seed: 2113365373 00:08:47.762 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:47.762 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:47.762 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:47.762 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.762 #2 INITED exec/s: 0 rss: 66Mb 00:08:47.762 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:47.762 This may also happen if the target rejected all inputs we tried so far 00:08:47.762 [2024-11-28 12:39:17.744660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:47.762 [2024-11-28 12:39:17.744690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.762 [2024-11-28 12:39:17.744755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:47.762 [2024-11-28 12:39:17.744772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.762 [2024-11-28 12:39:17.744828] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:47.762 [2024-11-28 12:39:17.744844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.762 [2024-11-28 12:39:17.744901] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:47.762 [2024-11-28 12:39:17.744916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.022 NEW_FUNC[1/717]: 0x489778 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:48.022 NEW_FUNC[2/717]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.022 #15 NEW cov: 12292 ft: 12291 corp: 2/24b lim: 25 exec/s: 0 rss: 73Mb L: 23/23 MS: 3 InsertByte-ChangeBit-InsertRepeatedBytes- 00:08:48.022 [2024-11-28 12:39:18.064633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.022 [2024-11-28 12:39:18.064670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.022 [2024-11-28 12:39:18.064716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.022 [2024-11-28 12:39:18.064733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:48.022 [2024-11-28 12:39:18.064787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.022 [2024-11-28 12:39:18.064802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.022 #16 NEW cov: 12405 ft: 13449 corp: 3/43b lim: 25 exec/s: 0 rss: 73Mb L: 19/23 MS: 1 CrossOver- 00:08:48.022 [2024-11-28 12:39:18.124647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.022 [2024-11-28 12:39:18.124677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.022 [2024-11-28 12:39:18.124741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.023 [2024-11-28 12:39:18.124756] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.023 [2024-11-28 12:39:18.124811] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.023 [2024-11-28 12:39:18.124826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.023 [2024-11-28 12:39:18.124880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.023 [2024-11-28 12:39:18.124898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.283 #17 NEW cov: 12411 ft: 13601 corp: 4/66b lim: 25 exec/s: 0 rss: 73Mb L: 23/23 MS: 1 ChangeBit- 00:08:48.283 [2024-11-28 12:39:18.164507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.283 [2024-11-28 12:39:18.164535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.164582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.283 [2024-11-28 12:39:18.164599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.164653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.283 [2024-11-28 12:39:18.164668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.283 #18 NEW cov: 12496 ft: 13850 corp: 5/84b lim: 25 exec/s: 0 rss: 74Mb L: 18/23 MS: 1 EraseBytes- 00:08:48.283 [2024-11-28 12:39:18.224717] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.283 [2024-11-28 12:39:18.224745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.224810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.283 [2024-11-28 12:39:18.224826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 
12:39:18.224880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.283 [2024-11-28 12:39:18.224896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.224951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.283 [2024-11-28 12:39:18.224966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.283 #19 NEW cov: 12496 ft: 13946 corp: 6/107b lim: 25 exec/s: 0 rss: 74Mb L: 23/23 MS: 1 ChangeBinInt- 00:08:48.283 [2024-11-28 12:39:18.264768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.283 [2024-11-28 12:39:18.264794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.264865] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.283 [2024-11-28 12:39:18.264881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.264935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.283 [2024-11-28 12:39:18.264951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.265003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.283 [2024-11-28 12:39:18.265019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.265073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:48.283 [2024-11-28 12:39:18.265089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:48.283 #25 NEW cov: 12496 ft: 14170 corp: 7/132b lim: 25 exec/s: 0 rss: 74Mb L: 25/25 MS: 1 CopyPart- 00:08:48.283 [2024-11-28 12:39:18.324676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.283 [2024-11-28 12:39:18.324702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.283 [2024-11-28 12:39:18.324758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.283 [2024-11-28 12:39:18.324771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.324839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.284 [2024-11-28 12:39:18.324855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.324911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.284 
[2024-11-28 12:39:18.324926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.284 #26 NEW cov: 12496 ft: 14236 corp: 8/156b lim: 25 exec/s: 0 rss: 74Mb L: 24/25 MS: 1 CrossOver- 00:08:48.284 [2024-11-28 12:39:18.364699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.284 [2024-11-28 12:39:18.364725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.364781] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.284 [2024-11-28 12:39:18.364796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.364850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.284 [2024-11-28 12:39:18.364865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.364919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.284 [2024-11-28 12:39:18.364935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.284 #32 NEW cov: 12496 ft: 14274 corp: 9/179b lim: 25 exec/s: 0 rss: 74Mb L: 23/25 MS: 1 ShuffleBytes- 00:08:48.284 [2024-11-28 12:39:18.404852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.284 [2024-11-28 12:39:18.404879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.404935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.284 [2024-11-28 12:39:18.404950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.405003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.284 [2024-11-28 12:39:18.405018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.405074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.284 [2024-11-28 12:39:18.405089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.284 [2024-11-28 12:39:18.405142] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:48.284 [2024-11-28 12:39:18.405157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:48.543 #33 NEW cov: 12496 ft: 14350 corp: 10/204b lim: 25 exec/s: 0 rss: 74Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:48.543 [2024-11-28 12:39:18.464742] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.543 [2024-11-28 
12:39:18.464767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.543 [2024-11-28 12:39:18.464836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.543 [2024-11-28 12:39:18.464853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.543 [2024-11-28 12:39:18.464904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.543 [2024-11-28 12:39:18.464919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.543 [2024-11-28 12:39:18.464974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.543 [2024-11-28 12:39:18.464990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.543 #34 NEW cov: 12496 ft: 14392 corp: 11/228b lim: 25 exec/s: 0 rss: 74Mb L: 24/25 MS: 1 ChangeBit- 00:08:48.543 [2024-11-28 12:39:18.524987] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.543 [2024-11-28 12:39:18.525014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.543 [2024-11-28 12:39:18.525071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.543 [2024-11-28 12:39:18.525087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.543 [2024-11-28 12:39:18.525141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.543 [2024-11-28 12:39:18.525157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.543 [2024-11-28 12:39:18.525211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.543 [2024-11-28 12:39:18.525227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.544 [2024-11-28 12:39:18.525280] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:48.544 [2024-11-28 12:39:18.525295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:48.544 #35 NEW cov: 12496 ft: 14481 corp: 12/253b lim: 25 exec/s: 0 rss: 74Mb L: 25/25 MS: 1 ChangeBit- 00:08:48.544 [2024-11-28 12:39:18.584810] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.544 [2024-11-28 12:39:18.584836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.544 [2024-11-28 12:39:18.584890] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.544 [2024-11-28 12:39:18.584906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:48.544 [2024-11-28 12:39:18.584976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.544 [2024-11-28 12:39:18.584992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.544 [2024-11-28 12:39:18.585044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.544 [2024-11-28 12:39:18.585060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.544 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:48.544 #36 NEW cov: 12519 ft: 14504 corp: 13/277b lim: 25 exec/s: 0 rss: 74Mb L: 24/25 MS: 1 CrossOver- 00:08:48.544 [2024-11-28 12:39:18.644845] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.544 [2024-11-28 12:39:18.644872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.544 [2024-11-28 12:39:18.644927] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.544 [2024-11-28 12:39:18.644941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.544 [2024-11-28 12:39:18.644994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.544 [2024-11-28 12:39:18.645008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.544 [2024-11-28 12:39:18.645062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.544 [2024-11-28 12:39:18.645076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.803 #37 NEW cov: 12519 ft: 14595 corp: 14/297b lim: 25 exec/s: 0 rss: 74Mb L: 20/25 MS: 1 InsertByte- 00:08:48.803 [2024-11-28 12:39:18.684853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.803 [2024-11-28 12:39:18.684880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.803 [2024-11-28 12:39:18.684933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.803 [2024-11-28 12:39:18.684946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.803 [2024-11-28 12:39:18.685001] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.803 [2024-11-28 12:39:18.685016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.803 [2024-11-28 12:39:18.685070] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.803 [2024-11-28 12:39:18.685084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 
sqhd:0005 p:0 m:0 dnr:1 00:08:48.803 #38 NEW cov: 12519 ft: 14622 corp: 15/321b lim: 25 exec/s: 38 rss: 74Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:48.803 [2024-11-28 12:39:18.744899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.803 [2024-11-28 12:39:18.744926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.803 [2024-11-28 12:39:18.744993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.803 [2024-11-28 12:39:18.745010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.803 [2024-11-28 12:39:18.745064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.804 [2024-11-28 12:39:18.745078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.745132] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.804 [2024-11-28 12:39:18.745147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.804 #39 NEW cov: 12519 ft: 14630 corp: 16/345b lim: 25 exec/s: 39 rss: 74Mb L: 24/25 MS: 1 ChangeByte- 00:08:48.804 [2024-11-28 12:39:18.784764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.804 [2024-11-28 12:39:18.784794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.784848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.804 [2024-11-28 12:39:18.784864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.784919] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.804 [2024-11-28 12:39:18.784934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.804 #40 NEW cov: 12519 ft: 14638 corp: 17/360b lim: 25 exec/s: 40 rss: 74Mb L: 15/25 MS: 1 EraseBytes- 00:08:48.804 [2024-11-28 12:39:18.824925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.804 [2024-11-28 12:39:18.824951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.825023] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.804 [2024-11-28 12:39:18.825037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.825094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.804 [2024-11-28 12:39:18.825109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 
m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.825162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.804 [2024-11-28 12:39:18.825177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.804 #41 NEW cov: 12519 ft: 14652 corp: 18/383b lim: 25 exec/s: 41 rss: 74Mb L: 23/25 MS: 1 ShuffleBytes- 00:08:48.804 [2024-11-28 12:39:18.864929] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.804 [2024-11-28 12:39:18.864956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.865013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.804 [2024-11-28 12:39:18.865027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.865096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.804 [2024-11-28 12:39:18.865112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.865167] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.804 [2024-11-28 12:39:18.865182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.804 #42 NEW cov: 12519 ft: 14712 corp: 19/407b lim: 25 exec/s: 42 rss: 74Mb L: 24/25 MS: 1 CrossOver- 00:08:48.804 [2024-11-28 12:39:18.905046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:48.804 [2024-11-28 12:39:18.905072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.905145] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:48.804 [2024-11-28 12:39:18.905161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.905215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:48.804 [2024-11-28 12:39:18.905234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.905288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:48.804 [2024-11-28 12:39:18.905304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.804 [2024-11-28 12:39:18.905359] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:48.804 [2024-11-28 12:39:18.905375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.063 #43 NEW cov: 12519 ft: 14727 corp: 20/432b lim: 25 exec/s: 43 rss: 74Mb L: 25/25 MS: 1 ChangeBit- 
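The "#N NEW cov:" lines above are standard libFuzzer status output: "cov" counts covered code edges, "ft" counts coverage features, "corp" gives the corpus size in entries/bytes, "exec/s" the execution rate, "rss" the resident memory, "L" (roughly) the new input's length against the largest in the corpus, and "MS" the mutation sequence that produced the input. A minimal shell sketch for pulling the coverage progression out of a log like this one ("build.log" is a placeholder name, and the awk field positions assume exactly the layout shown above):

  # Extract run number, edge coverage, feature count, and corpus entries/bytes
  # from libFuzzer "#N NEW cov: ..." status lines; build.log is hypothetical.
  grep -o '#[0-9]* NEW cov: [0-9]* ft: [0-9]* corp: [0-9]*/[0-9]*b' build.log \
    | awk '{ print $1, $4, $6, $8 }'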
00:08:49.063 [2024-11-28 12:39:18.964971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.063 [2024-11-28 12:39:18.964998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.063 [2024-11-28 12:39:18.965067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.063 [2024-11-28 12:39:18.965084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:18.965138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.064 [2024-11-28 12:39:18.965154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:18.965210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.064 [2024-11-28 12:39:18.965224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.064 #44 NEW cov: 12519 ft: 14751 corp: 21/453b lim: 25 exec/s: 44 rss: 74Mb L: 21/25 MS: 1 InsertRepeatedBytes- 00:08:49.064 [2024-11-28 12:39:19.024997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.064 [2024-11-28 12:39:19.025023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.025095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.064 [2024-11-28 12:39:19.025110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.025164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.064 [2024-11-28 12:39:19.025179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.025235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.064 [2024-11-28 12:39:19.025250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.064 #45 NEW cov: 12519 ft: 14806 corp: 22/477b lim: 25 exec/s: 45 rss: 75Mb L: 24/25 MS: 1 ChangeBinInt- 00:08:49.064 [2024-11-28 12:39:19.085003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.064 [2024-11-28 12:39:19.085030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.085084] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.064 [2024-11-28 12:39:19.085099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.085158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT 
(0e) sqid:1 cid:2 nsid:0 00:08:49.064 [2024-11-28 12:39:19.085173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.085227] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.064 [2024-11-28 12:39:19.085241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.064 #46 NEW cov: 12519 ft: 14814 corp: 23/500b lim: 25 exec/s: 46 rss: 75Mb L: 23/25 MS: 1 CopyPart- 00:08:49.064 [2024-11-28 12:39:19.124925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.064 [2024-11-28 12:39:19.124951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.125000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.064 [2024-11-28 12:39:19.125016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.125071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.064 [2024-11-28 12:39:19.125087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.064 #47 NEW cov: 12519 ft: 14888 corp: 24/517b lim: 25 exec/s: 47 rss: 75Mb L: 17/25 MS: 1 EraseBytes- 00:08:49.064 [2024-11-28 12:39:19.185207] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.064 [2024-11-28 12:39:19.185234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.185290] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.064 [2024-11-28 12:39:19.185306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.185360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.064 [2024-11-28 12:39:19.185375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.185430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.064 [2024-11-28 12:39:19.185445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.064 [2024-11-28 12:39:19.185506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:49.064 [2024-11-28 12:39:19.185521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.324 #48 NEW cov: 12519 ft: 14916 corp: 25/542b lim: 25 exec/s: 48 rss: 75Mb L: 25/25 MS: 1 InsertByte- 00:08:49.324 [2024-11-28 12:39:19.245199] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 
nsid:0 00:08:49.324 [2024-11-28 12:39:19.245226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.245284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.324 [2024-11-28 12:39:19.245299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.245354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.324 [2024-11-28 12:39:19.245369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.245424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.324 [2024-11-28 12:39:19.245439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.245499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:49.324 [2024-11-28 12:39:19.245513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.324 #49 NEW cov: 12519 ft: 14962 corp: 26/567b lim: 25 exec/s: 49 rss: 75Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:49.324 [2024-11-28 12:39:19.305007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.324 [2024-11-28 12:39:19.305035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.305085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.324 [2024-11-28 12:39:19.305102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.305157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.324 [2024-11-28 12:39:19.305172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.324 #50 NEW cov: 12519 ft: 15028 corp: 27/582b lim: 25 exec/s: 50 rss: 75Mb L: 15/25 MS: 1 ShuffleBytes- 00:08:49.324 [2024-11-28 12:39:19.345126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.324 [2024-11-28 12:39:19.345153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.345208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.324 [2024-11-28 12:39:19.345222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.345276] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.324 [2024-11-28 12:39:19.345291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.324 [2024-11-28 12:39:19.345346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.324 [2024-11-28 12:39:19.345361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.324 #51 NEW cov: 12519 ft: 15032 corp: 28/605b lim: 25 exec/s: 51 rss: 75Mb L: 23/25 MS: 1 ChangeBinInt- 00:08:49.324 [2024-11-28 12:39:19.385287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.325 [2024-11-28 12:39:19.385313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.325 [2024-11-28 12:39:19.385383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.325 [2024-11-28 12:39:19.385399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.325 [2024-11-28 12:39:19.385454] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.325 [2024-11-28 12:39:19.385476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.325 [2024-11-28 12:39:19.385532] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.325 [2024-11-28 12:39:19.385547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.325 [2024-11-28 12:39:19.385608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:49.325 [2024-11-28 12:39:19.385629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.325 #52 NEW cov: 12519 ft: 15072 corp: 29/630b lim: 25 exec/s: 52 rss: 75Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:49.325 [2024-11-28 12:39:19.425284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.325 [2024-11-28 12:39:19.425310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.325 [2024-11-28 12:39:19.425380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.325 [2024-11-28 12:39:19.425396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.325 [2024-11-28 12:39:19.425453] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.325 [2024-11-28 12:39:19.425467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.325 [2024-11-28 12:39:19.425524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.325 [2024-11-28 12:39:19.425549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.325 [2024-11-28 12:39:19.425605] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:49.325 [2024-11-28 12:39:19.425619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.585 #53 NEW cov: 12519 ft: 15084 corp: 30/655b lim: 25 exec/s: 53 rss: 75Mb L: 25/25 MS: 1 InsertByte- 00:08:49.585 [2024-11-28 12:39:19.485062] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.585 [2024-11-28 12:39:19.485088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.585 [2024-11-28 12:39:19.485141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.585 [2024-11-28 12:39:19.485157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.585 [2024-11-28 12:39:19.485212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.585 [2024-11-28 12:39:19.485226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.585 #54 NEW cov: 12519 ft: 15096 corp: 31/671b lim: 25 exec/s: 54 rss: 75Mb L: 16/25 MS: 1 EraseBytes- 00:08:49.585 [2024-11-28 12:39:19.524976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.585 [2024-11-28 12:39:19.525002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.585 [2024-11-28 12:39:19.525054] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.585 [2024-11-28 12:39:19.525070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.585 #55 NEW cov: 12519 ft: 15398 corp: 32/683b lim: 25 exec/s: 55 rss: 75Mb L: 12/25 MS: 1 CrossOver- 00:08:49.585 [2024-11-28 12:39:19.585264] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.585 [2024-11-28 12:39:19.585290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.585 [2024-11-28 12:39:19.585362] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.585 [2024-11-28 12:39:19.585385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.585 [2024-11-28 12:39:19.585438] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.585 [2024-11-28 12:39:19.585454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.585514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.586 [2024-11-28 12:39:19.585529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.586 #56 NEW cov: 12519 ft: 15412 corp: 33/707b lim: 25 exec/s: 56 
rss: 75Mb L: 24/25 MS: 1 CopyPart- 00:08:49.586 [2024-11-28 12:39:19.625302] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.586 [2024-11-28 12:39:19.625328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.625402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.586 [2024-11-28 12:39:19.625417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.625475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.586 [2024-11-28 12:39:19.625491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.625555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.586 [2024-11-28 12:39:19.625570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.586 #57 NEW cov: 12519 ft: 15445 corp: 34/731b lim: 25 exec/s: 57 rss: 75Mb L: 24/25 MS: 1 ChangeByte- 00:08:49.586 [2024-11-28 12:39:19.665408] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.586 [2024-11-28 12:39:19.665433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.665506] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.586 [2024-11-28 12:39:19.665523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.665602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.586 [2024-11-28 12:39:19.665618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.665675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:49.586 [2024-11-28 12:39:19.665691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.665758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:49.586 [2024-11-28 12:39:19.665778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.586 #58 NEW cov: 12519 ft: 15467 corp: 35/756b lim: 25 exec/s: 58 rss: 75Mb L: 25/25 MS: 1 InsertByte- 00:08:49.586 [2024-11-28 12:39:19.705170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:49.586 [2024-11-28 12:39:19.705196] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.705259] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:49.586 [2024-11-28 12:39:19.705279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.586 [2024-11-28 12:39:19.705337] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:49.586 [2024-11-28 12:39:19.705353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.846 #59 NEW cov: 12519 ft: 15483 corp: 36/771b lim: 25 exec/s: 29 rss: 75Mb L: 15/25 MS: 1 EraseBytes- 00:08:49.846 #59 DONE cov: 12519 ft: 15483 corp: 36/771b lim: 25 exec/s: 29 rss: 75Mb 00:08:49.846 Done 59 runs in 2 second(s) 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@24 -- # local timen=1 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@25 -- # local core=0x1 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # printf %02d 24 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@34 -- # port=4424 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:49.846 12:39:19 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:49.846 [2024-11-28 12:39:19.888396] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
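The shell trace above shows run 23 finishing (59 runs in 2 seconds) and nvmf/run.sh preparing fuzzer 24: each target gets a private JSON config, its own corpus directory, and a TCP listener whose port is "44" plus the zero-padded target number (printf %02d 24 gives 4424). A simplified sketch of that per-target pattern, paraphrased from the logged commands rather than taken from the actual run.sh source ($rootdir stands in for the absolute workspace paths in the log, and the -P output flag from the trace is omitted for brevity):

  # Hedged paraphrase of the per-target setup visible in the trace above.
  start_llvm_fuzz() {
    local fuzzer_type=$1 timen=$2 core=$3
    local port="44$(printf %02d "$fuzzer_type")"   # e.g. 24 -> 4424
    local conf="/tmp/fuzz_json_${fuzzer_type}.conf"
    local corpus="$rootdir/../corpus/llvm_nvmf_${fuzzer_type}"

    mkdir -p "$corpus"
    # Rewrite the shared fuzz config so this run listens on its own port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$conf"

    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
      -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
      -c "$conf" -t "$timen" -D "$corpus" -Z "$fuzzer_type"
  }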
00:08:49.846 [2024-11-28 12:39:19.888462] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid615043 ] 00:08:50.105 [2024-11-28 12:39:20.126664] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:50.105 [2024-11-28 12:39:20.172712] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.105 [2024-11-28 12:39:20.188741] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.366 [2024-11-28 12:39:20.241772] tcp.c: 738:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:50.366 [2024-11-28 12:39:20.257892] tcp.c:1082:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:50.366 INFO: Running with entropic power schedule (0xFF, 100). 00:08:50.366 INFO: Seed: 387401413 00:08:50.366 INFO: Loaded 1 modules (389789 inline 8-bit counters): 389789 [0x2af4f4c, 0x2b541e9), 00:08:50.366 INFO: Loaded 1 PC tables (389789 PCs): 389789 [0x2b541f0,0x3146bc0), 00:08:50.366 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:50.366 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.366 #2 INITED exec/s: 0 rss: 66Mb 00:08:50.366 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:50.366 This may also happen if the target rejected all inputs we tried so far 00:08:50.366 [2024-11-28 12:39:20.326061] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.366 [2024-11-28 12:39:20.326102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.366 [2024-11-28 12:39:20.326188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.366 [2024-11-28 12:39:20.326208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.366 [2024-11-28 12:39:20.326283] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.366 [2024-11-28 12:39:20.326299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.366 [2024-11-28 12:39:20.326394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.366 [2024-11-28 12:39:20.326414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.626 NEW_FUNC[1/718]: 0x48a868 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:50.626 NEW_FUNC[2/718]: 0x49b4e8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:50.626 #3 NEW cov: 12364 ft: 12365 corp: 2/90b lim: 100 exec/s: 0 rss: 73Mb L: 89/89 MS: 1 InsertRepeatedBytes- 00:08:50.626 [2024-11-28 12:39:20.675805] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.626 [2024-11-28 12:39:20.675850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.626 [2024-11-28 12:39:20.675960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.626 [2024-11-28 12:39:20.675979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.626 [2024-11-28 12:39:20.676075] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.626 [2024-11-28 12:39:20.676092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.626 #4 NEW cov: 12477 ft: 13376 corp: 3/154b lim: 100 exec/s: 0 rss: 73Mb L: 64/89 MS: 1 InsertRepeatedBytes- 00:08:50.626 [2024-11-28 12:39:20.735986] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.626 [2024-11-28 12:39:20.736018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.627 [2024-11-28 12:39:20.736085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.627 [2024-11-28 12:39:20.736105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.627 [2024-11-28 12:39:20.736166] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.627 [2024-11-28 12:39:20.736186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.888 #5 NEW cov: 12486 ft: 13802 corp: 4/218b lim: 100 exec/s: 0 rss: 73Mb L: 64/89 MS: 1 CopyPart- 00:08:50.888 [2024-11-28 12:39:20.806292] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.806320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.806382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.806400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.806476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.806493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.888 #11 NEW cov: 12571 ft: 14051 corp: 5/282b lim: 100 exec/s: 0 rss: 74Mb L: 64/89 MS: 1 ChangeBit- 00:08:50.888 [2024-11-28 12:39:20.856878] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.856924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.856997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.857019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.857081] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.857097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.857187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4194304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.857205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.888 #12 NEW cov: 12571 ft: 14156 corp: 6/372b lim: 100 exec/s: 0 rss: 74Mb L: 90/90 MS: 1 InsertByte- 00:08:50.888 [2024-11-28 12:39:20.927386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.927418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.927539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.927557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.927645] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.927659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.927742] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:65 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.927758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.927850] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.927867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:50.888 #13 NEW cov: 12571 ft: 14248 corp: 7/472b lim: 100 exec/s: 0 rss: 74Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:50.888 [2024-11-28 12:39:20.997238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 
[2024-11-28 12:39:20.997266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.997347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.997366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.997447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.997464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.888 [2024-11-28 12:39:20.997558] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:50.888 [2024-11-28 12:39:20.997578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.148 #14 NEW cov: 12571 ft: 14379 corp: 8/558b lim: 100 exec/s: 0 rss: 74Mb L: 86/100 MS: 1 CopyPart- 00:08:51.148 [2024-11-28 12:39:21.047565] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.047594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.047673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.047691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.047773] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.047792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.047879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:65 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.047896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.047994] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.048009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.148 #15 NEW cov: 12571 ft: 14409 corp: 9/658b lim: 100 exec/s: 0 rss: 74Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:51.148 [2024-11-28 12:39:21.117603] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069590417248 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.117635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.117716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.117739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.117796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.117812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.117903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.117920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.148 #20 NEW cov: 12571 ft: 14478 corp: 10/740b lim: 100 exec/s: 0 rss: 74Mb L: 82/100 MS: 5 InsertByte-InsertRepeatedBytes-InsertByte-ShuffleBytes-CrossOver- 00:08:51.148 [2024-11-28 12:39:21.167696] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.167727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.167814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.167833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.167906] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.167925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.168026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4194304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.168045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.148 NEW_FUNC[1/1]: 0x1c683a8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:51.148 #21 NEW cov: 12594 ft: 14564 corp: 11/830b lim: 100 exec/s: 0 rss: 74Mb L: 90/100 MS: 1 ShuffleBytes- 00:08:51.148 [2024-11-28 12:39:21.218388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:68719476736 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.218418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.218514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.218531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.218627] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.148 [2024-11-28 12:39:21.218645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.148 [2024-11-28 12:39:21.218735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:65 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.149 [2024-11-28 12:39:21.218754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.149 [2024-11-28 12:39:21.218849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.149 [2024-11-28 12:39:21.218870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.149 #22 NEW cov: 12594 ft: 14588 corp: 12/930b lim: 100 exec/s: 0 rss: 74Mb L: 100/100 MS: 1 CMP- DE: "\000\020"- 00:08:51.408 [2024-11-28 12:39:21.297951] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.297982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.298058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.298079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.298146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.298164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.408 #23 NEW cov: 12594 ft: 14606 corp: 13/994b lim: 100 exec/s: 23 rss: 74Mb L: 64/100 MS: 1 CopyPart- 00:08:51.408 [2024-11-28 12:39:21.378323] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.378354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.378447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.378467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.378536] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.378556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.378656] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4194304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.378676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.408 #24 NEW cov: 12594 ft: 14618 corp: 14/1084b lim: 100 exec/s: 24 rss: 74Mb L: 90/100 MS: 1 ChangeByte- 00:08:51.408 [2024-11-28 12:39:21.428943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.428974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.429068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.429090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.429148] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.429167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.429259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.429278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.429369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:17506321826814554866 len:62195 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.429390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.408 #25 NEW cov: 12594 ft: 14658 corp: 15/1184b lim: 100 exec/s: 25 rss: 74Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:51.408 [2024-11-28 12:39:21.508748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.508782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.508837] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:197912092999680 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.508857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.508923] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.508945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.408 [2024-11-28 12:39:21.509032] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:51.408 [2024-11-28 12:39:21.509054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.408 #26 NEW cov: 12594 ft: 14676 corp: 16/1271b lim: 100 exec/s: 26 rss: 74Mb L: 87/100 MS: 1 InsertByte- 00:08:51.668 [2024-11-28 12:39:21.558595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.668 [2024-11-28 12:39:21.558625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.668 [2024-11-28 12:39:21.558692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.668 [2024-11-28 12:39:21.558709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.668 [2024-11-28 12:39:21.558777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:131072 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.668 [2024-11-28 12:39:21.558794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.668 #27 NEW cov: 12594 ft: 14700 corp: 17/1335b lim: 100 exec/s: 27 rss: 74Mb L: 64/100 MS: 1 ChangeBinInt- 00:08:51.668 [2024-11-28 12:39:21.629650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:68719476736 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.668 [2024-11-28 12:39:21.629681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.668 [2024-11-28 12:39:21.629779] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.668 [2024-11-28 12:39:21.629796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.668 [2024-11-28 12:39:21.629887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.668 [2024-11-28 12:39:21.629905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.669 [2024-11-28 12:39:21.629990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:65 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.669 [2024-11-28 12:39:21.630008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.669 [2024-11-28 12:39:21.630106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.669 [2024-11-28 12:39:21.630125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.669 #28 NEW cov: 12594 ft: 14746 corp: 18/1435b lim: 100 exec/s: 28 rss: 74Mb L: 100/100 MS: 1 ChangeByte- 00:08:51.669 [2024-11-28 12:39:21.699133] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:51.669 [2024-11-28 12:39:21.699165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.669 [2024-11-28 12:39:21.699257] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.669 [2024-11-28 12:39:21.699276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.669 [2024-11-28 12:39:21.699357] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:64 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.669 [2024-11-28 12:39:21.699376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.669 #29 NEW cov: 12594 ft: 14788 corp: 19/1507b lim: 100 exec/s: 29 rss: 74Mb L: 72/100 MS: 1 EraseBytes- 00:08:51.669 [2024-11-28 12:39:21.768701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.669 [2024-11-28 12:39:21.768730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.928 #30 NEW cov: 12594 ft: 15594 corp: 20/1536b lim: 100 exec/s: 30 rss: 74Mb L: 29/100 MS: 1 InsertRepeatedBytes- 00:08:51.928 [2024-11-28 12:39:21.819562] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.928 [2024-11-28 12:39:21.819593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.928 [2024-11-28 12:39:21.819666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.928 [2024-11-28 12:39:21.819685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.928 [2024-11-28 12:39:21.819756] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.928 [2024-11-28 12:39:21.819775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.928 #31 NEW cov: 12594 ft: 15614 corp: 21/1606b lim: 100 exec/s: 31 rss: 74Mb L: 70/100 MS: 1 CrossOver- 00:08:51.928 [2024-11-28 12:39:21.870677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.928 [2024-11-28 12:39:21.870706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.928 [2024-11-28 12:39:21.870808] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.928 [2024-11-28 12:39:21.870826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.928 [2024-11-28 12:39:21.870916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
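Every rejected COMPARE above reports the same completion status, printed as "INVALID NAMESPACE OR FORMAT (00/0b)": status code type 0x0 (Generic Command Status) and status code 0x0B (Invalid Namespace or Format). A minimal standalone C sketch, not SPDK code, of how those two fields pack into the 16-bit completion status word whose p/m/dnr bits also appear in these prints:

    /* Hypothetical illustration, not an SPDK API: the "(SCT/SC)" pair that
     * spdk_nvme_print_completion logs, packed into the NVMe status field. */
    #include <stdio.h>

    int main(void)
    {
        unsigned sct = 0x0;  /* Status Code Type 0x0: Generic Command Status  */
        unsigned sc  = 0x0b; /* Status Code 0x0B: Invalid Namespace or Format */
        unsigned dnr = 1;    /* Do Not Retry, matching "dnr:1" in the log     */

        /* 16-bit status field layout: P is bit 0, SC bits 8:1, SCT bits 11:9,
         * M bit 14, DNR bit 15. */
        unsigned status = (dnr << 15) | (sct << 9) | (sc << 1);
        printf("(%02x/%02x) -> status field 0x%04x\n", sct, sc, status);
        return 0;
    }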
00:08:51.928 [2024-11-28 12:39:21.870934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.928 [2024-11-28 12:39:21.871033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:21.871051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.929 [2024-11-28 12:39:21.871139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:17506321826814554866 len:62195 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:21.871155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.929 #32 NEW cov: 12594 ft: 15628 corp: 22/1706b lim: 100 exec/s: 32 rss: 75Mb L: 100/100 MS: 1 ChangeBit- 00:08:51.929 [2024-11-28 12:39:21.940878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:21.940906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.929 [2024-11-28 12:39:21.940993] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:21.941010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.929 [2024-11-28 12:39:21.941092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:21.941111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.929 [2024-11-28 12:39:21.941199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:21.941214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.929 [2024-11-28 12:39:21.941307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:4076008178 len:62195 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:21.941323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.929 #33 NEW cov: 12594 ft: 15663 corp: 23/1806b lim: 100 exec/s: 33 rss: 75Mb L: 100/100 MS: 1 CrossOver- 00:08:51.929 [2024-11-28 12:39:22.010347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446742974197924086 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:22.010375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.929 [2024-11-28 12:39:22.010447] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:22.010464] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.929 [2024-11-28 12:39:22.010570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:51.929 [2024-11-28 12:39:22.010587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.929 #34 NEW cov: 12594 ft: 15698 corp: 24/1870b lim: 100 exec/s: 34 rss: 75Mb L: 64/100 MS: 1 ChangeBinInt- 00:08:52.188 [2024-11-28 12:39:22.060822] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744069590417248 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.188 [2024-11-28 12:39:22.060852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.188 [2024-11-28 12:39:22.060931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.188 [2024-11-28 12:39:22.060952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.188 [2024-11-28 12:39:22.061027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.188 [2024-11-28 12:39:22.061044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.188 [2024-11-28 12:39:22.061128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:3318072773 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.188 [2024-11-28 12:39:22.061146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.188 #35 NEW cov: 12594 ft: 15730 corp: 25/1957b lim: 100 exec/s: 35 rss: 75Mb L: 87/100 MS: 1 InsertRepeatedBytes- 00:08:52.188 [2024-11-28 12:39:22.131461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:68719476736 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.188 [2024-11-28 12:39:22.131493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.188 [2024-11-28 12:39:22.131582] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.131601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.131688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.131707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.131796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:65 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.131813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
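The interleaved "#NN NEW cov: ..." lines are libFuzzer's progress format: cov and ft are coverage and feature counters, corp is corpus entries/bytes, lim the current input size limit, exec/s the execution rate, rss resident memory, and MS the mutation sequence that produced the input. A self-contained C sketch, assumed for illustration only (not part of SPDK or libFuzzer), that scrapes the numeric fields out of one such line:

    /* Hypothetical log scraper for libFuzzer status lines. */
    #include <stdio.h>

    int main(void)
    {
        const char *line =
            "#21 NEW cov: 12594 ft: 14564 corp: 11/830b lim: 100 exec/s: 0 rss: 74Mb";
        unsigned id, cov, ft, corp_n, corp_b, lim, execs, rss;

        if (sscanf(line,
                   "#%u NEW cov: %u ft: %u corp: %u/%ub lim: %u exec/s: %u rss: %uMb",
                   &id, &cov, &ft, &corp_n, &corp_b, &lim, &execs, &rss) == 8)
            printf("unit %u: cov=%u ft=%u corpus=%u entries/%u bytes\n",
                   id, cov, ft, corp_n, corp_b);
        return 0;
    }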
00:08:52.189 [2024-11-28 12:39:22.131905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.131924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.189 #36 NEW cov: 12594 ft: 15802 corp: 26/2057b lim: 100 exec/s: 36 rss: 75Mb L: 100/100 MS: 1 CrossOver- 00:08:52.189 [2024-11-28 12:39:22.201581] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.201611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.201687] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.201704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.201782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.201800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.201890] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:0 len:65 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.201909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.202004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.202021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.189 #42 NEW cov: 12594 ft: 15833 corp: 27/2157b lim: 100 exec/s: 42 rss: 75Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:52.189 [2024-11-28 12:39:22.271274] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.271304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.271377] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.271395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.271455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.271476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.189 [2024-11-28 12:39:22.271564] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 
cid:3 nsid:0 lba:4194304 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.189 [2024-11-28 12:39:22.271583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.189 #43 NEW cov: 12594 ft: 15867 corp: 28/2247b lim: 100 exec/s: 43 rss: 75Mb L: 90/100 MS: 1 ShuffleBytes- 00:08:52.448 [2024-11-28 12:39:22.321429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.448 [2024-11-28 12:39:22.321460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.448 [2024-11-28 12:39:22.321555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.448 [2024-11-28 12:39:22.321574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.448 [2024-11-28 12:39:22.321639] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16607023625928704 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.448 [2024-11-28 12:39:22.321660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.448 [2024-11-28 12:39:22.321754] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16384 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.448 [2024-11-28 12:39:22.321776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.448 #44 NEW cov: 12594 ft: 15876 corp: 29/2338b lim: 100 exec/s: 22 rss: 75Mb L: 91/100 MS: 1 InsertByte- 00:08:52.448 #44 DONE cov: 12594 ft: 15876 corp: 29/2338b lim: 100 exec/s: 22 rss: 75Mb 00:08:52.448 ###### Recommended dictionary. ###### 00:08:52.448 "\000\020" # Uses: 0 00:08:52.448 ###### End of recommended dictionary. 
###### 00:08:52.448 Done 44 runs in 2 second(s) 00:08:52.448 12:39:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:52.448 12:39:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:52.448 12:39:22 llvm_fuzz.nvmf_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.448 12:39:22 llvm_fuzz.nvmf_llvm_fuzz -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:52.448 00:08:52.448 real 1m6.624s 00:08:52.448 user 1m39.280s 00:08:52.448 sys 0m8.577s 00:08:52.448 12:39:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:52.448 12:39:22 llvm_fuzz.nvmf_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:52.448 ************************************ 00:08:52.448 END TEST nvmf_llvm_fuzz 00:08:52.448 ************************************ 00:08:52.448 12:39:22 llvm_fuzz -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:52.448 12:39:22 llvm_fuzz -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:52.448 12:39:22 llvm_fuzz -- fuzz/llvm.sh@20 -- # run_test vfio_llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:52.448 12:39:22 llvm_fuzz -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:52.448 12:39:22 llvm_fuzz -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:52.448 12:39:22 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:08:52.448 ************************************ 00:08:52.448 START TEST vfio_llvm_fuzz 00:08:52.448 ************************************ 00:08:52.448 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1129 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:52.711 * Looking for test storage... 
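A few lines below, the harness traces scripts/common.sh comparing the installed lcov version against 2 ("lt 1.15 2"): each version string is split on dots and compared field by field until one side differs. The same idea as a standalone C sketch, an assumed illustration rather than the real bash helper shown in the trace:

    /* Hypothetical dotted-version comparison mirroring the cmp_versions
     * shell trace below; returns <0, 0 or >0 like strcmp. */
    #include <stdio.h>
    #include <stdlib.h>

    static int cmp_versions(const char *a, const char *b)
    {
        char *ea, *eb;
        while (*a || *b) {
            long x = strtol(a, &ea, 10);   /* next numeric field of a */
            long y = strtol(b, &eb, 10);   /* next numeric field of b */
            if (x != y)
                return x < y ? -1 : 1;
            a = (*ea == '.') ? ea + 1 : ea;  /* step past the dot */
            b = (*eb == '.') ? eb + 1 : eb;
        }
        return 0;
    }

    int main(void)
    {
        /* "1.15" vs "2": the leading fields 1 and 2 already decide it. */
        printf("cmp_versions(\"1.15\", \"2\") = %d\n", cmp_versions("1.15", "2"));
        return 0;
    }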
00:08:52.711 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:52.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:52.711 --rc genhtml_branch_coverage=1 00:08:52.711 --rc genhtml_function_coverage=1 00:08:52.711 --rc genhtml_legend=1 00:08:52.711 --rc geninfo_all_blocks=1 00:08:52.711 --rc geninfo_unexecuted_blocks=1 00:08:52.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:52.711 ' 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:52.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:52.711 --rc genhtml_branch_coverage=1 00:08:52.711 --rc genhtml_function_coverage=1 00:08:52.711 --rc genhtml_legend=1 00:08:52.711 --rc geninfo_all_blocks=1 00:08:52.711 --rc geninfo_unexecuted_blocks=1 00:08:52.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:52.711 ' 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:52.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:52.711 --rc genhtml_branch_coverage=1 00:08:52.711 --rc genhtml_function_coverage=1 00:08:52.711 --rc genhtml_legend=1 00:08:52.711 --rc geninfo_all_blocks=1 00:08:52.711 --rc geninfo_unexecuted_blocks=1 00:08:52.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:52.711 ' 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:52.711 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:52.711 --rc genhtml_branch_coverage=1 00:08:52.711 --rc genhtml_function_coverage=1 00:08:52.711 --rc genhtml_legend=1 00:08:52.711 --rc geninfo_all_blocks=1 00:08:52.711 --rc geninfo_unexecuted_blocks=1 00:08:52.711 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:52.711 ' 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz 
-- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@34 -- # set -e 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@20 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@23 -- # CONFIG_CET=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@24 -- # 
CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@37 -- # CONFIG_FUZZER=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:08:52.711 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@56 -- # CONFIG_XNVME=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=y 00:08:52.712 12:39:22 
llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@72 -- # CONFIG_SHARED=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@76 -- # CONFIG_FC=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@78 -- # CONFIG_FIO_PLUGIN=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/build_config.sh@90 -- # CONFIG_URING=n 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:52.712 12:39:22 
llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:52.712 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:52.712 #define SPDK_CONFIG_H 00:08:52.712 #define SPDK_CONFIG_AIO_FSDEV 1 00:08:52.712 #define SPDK_CONFIG_APPS 1 00:08:52.712 #define SPDK_CONFIG_ARCH native 00:08:52.712 #undef SPDK_CONFIG_ASAN 00:08:52.712 #undef SPDK_CONFIG_AVAHI 00:08:52.712 #undef SPDK_CONFIG_CET 00:08:52.712 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:08:52.712 #define SPDK_CONFIG_COVERAGE 1 00:08:52.712 #define SPDK_CONFIG_CROSS_PREFIX 00:08:52.712 #undef SPDK_CONFIG_CRYPTO 00:08:52.712 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:52.712 #undef SPDK_CONFIG_CUSTOMOCF 00:08:52.712 #undef SPDK_CONFIG_DAOS 00:08:52.712 #define SPDK_CONFIG_DAOS_DIR 00:08:52.712 #define SPDK_CONFIG_DEBUG 1 00:08:52.712 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:52.712 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:52.712 #define SPDK_CONFIG_DPDK_INC_DIR //var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/include 00:08:52.712 #define SPDK_CONFIG_DPDK_LIB_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:52.712 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:52.712 #undef SPDK_CONFIG_DPDK_UADK 00:08:52.712 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:52.712 #define SPDK_CONFIG_EXAMPLES 1 00:08:52.712 #undef SPDK_CONFIG_FC 00:08:52.712 #define SPDK_CONFIG_FC_PATH 00:08:52.712 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:52.712 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:52.712 #define SPDK_CONFIG_FSDEV 1 00:08:52.712 #undef 
SPDK_CONFIG_FUSE 00:08:52.712 #define SPDK_CONFIG_FUZZER 1 00:08:52.712 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:52.712 #undef SPDK_CONFIG_GOLANG 00:08:52.712 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:52.712 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:52.712 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:52.712 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:08:52.712 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:52.712 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:52.712 #undef SPDK_CONFIG_HAVE_LZ4 00:08:52.712 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:08:52.712 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:08:52.712 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:52.712 #define SPDK_CONFIG_IDXD 1 00:08:52.712 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:52.712 #undef SPDK_CONFIG_IPSEC_MB 00:08:52.713 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:52.713 #define SPDK_CONFIG_ISAL 1 00:08:52.713 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:52.713 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:52.713 #define SPDK_CONFIG_LIBDIR 00:08:52.713 #undef SPDK_CONFIG_LTO 00:08:52.713 #define SPDK_CONFIG_MAX_LCORES 128 00:08:52.713 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:08:52.713 #define SPDK_CONFIG_NVME_CUSE 1 00:08:52.713 #undef SPDK_CONFIG_OCF 00:08:52.713 #define SPDK_CONFIG_OCF_PATH 00:08:52.713 #define SPDK_CONFIG_OPENSSL_PATH 00:08:52.713 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:52.713 #define SPDK_CONFIG_PGO_DIR 00:08:52.713 #undef SPDK_CONFIG_PGO_USE 00:08:52.713 #define SPDK_CONFIG_PREFIX /usr/local 00:08:52.713 #undef SPDK_CONFIG_RAID5F 00:08:52.713 #undef SPDK_CONFIG_RBD 00:08:52.713 #define SPDK_CONFIG_RDMA 1 00:08:52.713 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:52.713 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:52.713 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:52.713 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:52.713 #undef SPDK_CONFIG_SHARED 00:08:52.713 #undef SPDK_CONFIG_SMA 00:08:52.713 #define SPDK_CONFIG_TESTS 1 00:08:52.713 #undef SPDK_CONFIG_TSAN 00:08:52.713 #define SPDK_CONFIG_UBLK 1 00:08:52.713 #define SPDK_CONFIG_UBSAN 1 00:08:52.713 #undef SPDK_CONFIG_UNIT_TESTS 00:08:52.713 #undef SPDK_CONFIG_URING 00:08:52.713 #define SPDK_CONFIG_URING_PATH 00:08:52.713 #undef SPDK_CONFIG_URING_ZNS 00:08:52.713 #undef SPDK_CONFIG_USDT 00:08:52.713 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:52.713 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:52.713 #define SPDK_CONFIG_VFIO_USER 1 00:08:52.713 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:52.713 #define SPDK_CONFIG_VHOST 1 00:08:52.713 #define SPDK_CONFIG_VIRTIO 1 00:08:52.713 #undef SPDK_CONFIG_VTUNE 00:08:52.713 #define SPDK_CONFIG_VTUNE_DIR 00:08:52.713 #define SPDK_CONFIG_WERROR 1 00:08:52.713 #define SPDK_CONFIG_WPDK_DIR 00:08:52.713 #undef SPDK_CONFIG_XNVME 00:08:52.713 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@15 -- # shopt -s extglob 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- 
scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@5 -- # export PATH 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@64 -- # TEST_TAG=N/A 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@67 -- # 
PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # uname -s 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@68 -- # PM_OS=Linux 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[0]= 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@76 -- # SUDO[1]='sudo -E' 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ Linux == Linux ]] 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power ]] 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@58 -- # : 1 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@62 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@64 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@66 -- # : 1 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@68 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@70 -- # : 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@72 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@74 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@76 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@78 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@80 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@82 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@84 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@86 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@88 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@90 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@92 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@94 -- # : 0 00:08:52.713 12:39:22 
llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@96 -- # : 0 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@98 -- # : 1 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@100 -- # : 1 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:08:52.713 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@102 -- # : rdma 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@104 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@106 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@108 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@110 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@112 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@114 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@116 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@118 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@120 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@122 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@124 -- # : 1 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@126 -- # : /var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@128 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:08:52.714 
12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@130 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@132 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@134 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@136 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@138 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@140 -- # : main 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@142 -- # : true 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@144 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@146 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@148 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@150 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@152 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@154 -- # : 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@156 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@158 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@160 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@162 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@164 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 
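[Editorial note] The long run of ": <value>" / "export SPDK_TEST_*" pairs traced above is bash's parameter-default idiom: each test flag gets a default only if the caller's environment has not already set it, and is then exported so child scripts inherit it. The ":" builtin evaluates its arguments and discards them, so the only side effect is the ${VAR:=default} expansion inside — which is why xtrace prints a bare ": 0" (or ": 1", ": rdma") before each export. A minimal sketch of the pattern, with flag names taken from the trace; the exact form in autotest_common.sh may differ (e.g. the stricter ${VAR=default}, which preserves an explicitly empty value):

    # Assign a default only when the flag is unset, then export it.
    # ':' does nothing with its arguments, so the ${VAR:=default}
    # expansion is the only side effect -- xtrace shows just ': 0'.
    : "${SPDK_TEST_CRYPTO:=0}"
    export SPDK_TEST_CRYPTO
    : "${SPDK_TEST_NVMF_TRANSPORT:=rdma}"
    export SPDK_TEST_NVMF_TRANSPORT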
00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@166 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@169 -- # : 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@171 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@173 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@175 -- # : 1 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@177 -- # : 0 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:52.714 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@191 -- # 
PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@206 -- # cat 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:52.715 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@259 -- 
# QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@262 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@269 -- # _LCOV= 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # [[ 1 -eq 1 ]] 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@270 -- # _LCOV=1 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@275 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # export valgrind= 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@279 -- # valgrind= 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # uname -s 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@289 -- # MAKE=make 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j72 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@309 -- # TEST_MODE= 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # [[ -z 615477 ]] 00:08:52.976 
12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@331 -- # kill -0 615477 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1678 -- # set_test_storage 2147483648 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@344 -- # local mount target_dir 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.UdeOrn 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@368 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.UdeOrn/tests/vfio /tmp/spdk.UdeOrn 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # df -T 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_devtmpfs 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=67108864 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=67108864 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/pmem0 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=ext2 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=4096 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=5284429824 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5284425728 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- 
common/autotest_common.sh@374 -- # mounts["$mount"]=spdk_root 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=overlay 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=84724277248 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=94500356096 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=9776078848 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=47245414400 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=47250178048 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=4763648 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=18894340096 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=18900074496 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=5734400 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=47249674240 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=47250178048 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=503808 00:08:52.976 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # avails["$mount"]=9450020864 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@375 -- # sizes["$mount"]=9450033152 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:08:52.977 * Looking for test storage... 
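[Editorial note] The set_test_storage trace above shows how the harness picks scratch space for the run: it walks the output of df -T, recording each mount's backing device, filesystem type, total size, used and available amounts into associative arrays keyed by mount point, then (in the lines that follow) checks the chosen candidate's free space against the roughly 2 GiB it requested. A standalone sketch of the parsing loop, reconstructed from the traced read and assignment statements; column order matches the trace, the process-substitution plumbing is inferred, and the units are whatever this df build reports:

    #!/usr/bin/env bash
    declare -A mounts fss avails sizes uses
    # Skip df's header row, then capture one row per mount point.
    while read -r source fs size use avail _ mount; do
      mounts["$mount"]=$source   # e.g. mounts[/]=spdk_root
      fss["$mount"]=$fs          # e.g. fss[/]=overlay
      avails["$mount"]=$avail    # free space on that mount
      sizes["$mount"]=$size
      uses["$mount"]=$use
    done < <(df -T | grep -v Filesystem)
    # Later: a candidate directory is accepted once the avail entry for
    # its mount point covers requested_size (here ~2 GiB plus margin).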
00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@381 -- # local target_space new_size 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@385 -- # mount=/ 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@387 -- # target_space=84724277248 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == tmpfs ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ overlay == ramfs ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@393 -- # [[ / == / ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@394 -- # new_size=11990671360 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@395 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:52.977 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@402 -- # return 0 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1680 -- # set -o errtrace 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1681 -- # shopt -s extdebug 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1682 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1684 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1685 -- # true 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1687 -- # xtrace_fd 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@27 -- # exec 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@29 -- # exec 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@16 -- # unset -v 
'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@18 -- # set -x 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1692 -- # [[ y == y ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # awk '{print $NF}' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lcov --version 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1693 -- # lt 1.15 2 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # IFS=.-: 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@336 -- # read -ra ver1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # IFS=.-: 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@337 -- # read -ra ver2 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@338 -- # local 'op=<' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@340 -- # ver1_l=2 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@341 -- # ver2_l=1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@344 -- # case "$op" in 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@345 -- # : 1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # decimal 1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@365 -- # ver1[v]=1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # decimal 2 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@353 -- # local d=2 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@355 -- # echo 2 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@366 -- # ver2[v]=2 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- scripts/common.sh@368 -- # return 0 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1694 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # export 'LCOV_OPTS= 00:08:52.977 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:52.977 --rc genhtml_branch_coverage=1 00:08:52.977 --rc genhtml_function_coverage=1 00:08:52.977 --rc genhtml_legend=1 00:08:52.977 --rc geninfo_all_blocks=1 00:08:52.977 --rc geninfo_unexecuted_blocks=1 00:08:52.977 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:52.977 ' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1706 -- # LCOV_OPTS=' 00:08:52.977 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:52.977 --rc genhtml_branch_coverage=1 00:08:52.977 --rc genhtml_function_coverage=1 00:08:52.977 --rc genhtml_legend=1 00:08:52.977 --rc geninfo_all_blocks=1 00:08:52.977 --rc geninfo_unexecuted_blocks=1 00:08:52.977 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:52.977 ' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # export 'LCOV=lcov 00:08:52.977 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:52.977 --rc genhtml_branch_coverage=1 00:08:52.977 --rc genhtml_function_coverage=1 00:08:52.977 --rc genhtml_legend=1 00:08:52.977 --rc geninfo_all_blocks=1 00:08:52.977 --rc geninfo_unexecuted_blocks=1 00:08:52.977 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:52.977 ' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1707 -- # LCOV='lcov 00:08:52.977 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:52.977 --rc genhtml_branch_coverage=1 00:08:52.977 --rc genhtml_function_coverage=1 00:08:52.977 --rc genhtml_legend=1 00:08:52.977 --rc geninfo_all_blocks=1 00:08:52.977 --rc geninfo_unexecuted_blocks=1 00:08:52.977 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:52.977 ' 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:52.977 12:39:22 
llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@8 -- # pids=() 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@68 -- # fuzz_num=7 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@74 -- # mem_size=0 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@69 -- # local fuzz_num=7 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@70 -- # local time=1 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i = 0 )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.977 12:39:22 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:52.977 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:52.977 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:52.977 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:52.977 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:52.977 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:52.977 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:52.977 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:52.978 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:52.978 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:52.978 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:52.978 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:52.978 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:52.978 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:52.978 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:52.978 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:52.978 12:39:23 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:52.978 [2024-11-28 12:39:23.044520] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:52.978 [2024-11-28 12:39:23.044594] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid615645 ] 00:08:53.238 [2024-11-28 12:39:23.180398] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:53.238 [2024-11-28 12:39:23.225884] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.238 [2024-11-28 12:39:23.251446] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:53.498 INFO: Running with entropic power schedule (0xFF, 100). 00:08:53.498 INFO: Seed: 3546399806 00:08:53.498 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:53.498 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:53.498 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:53.498 INFO: A corpus is not provided, starting from an empty corpus 00:08:53.498 #2 INITED exec/s: 0 rss: 67Mb 00:08:53.498 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:53.498 This may also happen if the target rejected all inputs we tried so far 00:08:53.498 [2024-11-28 12:39:23.488227] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:54.016 NEW_FUNC[1/676]: 0x45e728 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:54.016 NEW_FUNC[2/676]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:54.016 #5 NEW cov: 11236 ft: 11198 corp: 2/7b lim: 6 exec/s: 0 rss: 74Mb L: 6/6 MS: 3 ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:54.016 #21 NEW cov: 11250 ft: 13949 corp: 3/13b lim: 6 exec/s: 0 rss: 75Mb L: 6/6 MS: 1 ChangeByte- 00:08:54.331 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:54.331 #22 NEW cov: 11267 ft: 14679 corp: 4/19b lim: 6 exec/s: 0 rss: 76Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:54.641 #23 NEW cov: 11267 ft: 14956 corp: 5/25b lim: 6 exec/s: 0 rss: 76Mb L: 6/6 MS: 1 CopyPart- 00:08:54.641 #27 NEW cov: 11267 ft: 16146 corp: 6/31b lim: 6 exec/s: 27 rss: 76Mb L: 6/6 MS: 4 CrossOver-CrossOver-ChangeBit-InsertByte- 00:08:54.941 #28 NEW cov: 11267 ft: 17401 corp: 7/37b lim: 6 exec/s: 28 rss: 76Mb L: 6/6 MS: 1 CopyPart- 00:08:54.941 #29 NEW cov: 11267 ft: 17740 corp: 8/43b lim: 6 exec/s: 29 rss: 76Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:55.258 #31 NEW cov: 11267 ft: 18334 corp: 9/49b lim: 6 exec/s: 31 rss: 77Mb L: 6/6 MS: 2 EraseBytes-CopyPart- 00:08:55.258 #37 NEW cov: 11274 ft: 18394 corp: 10/55b lim: 6 exec/s: 37 rss: 77Mb L: 6/6 MS: 1 CMP- DE: "\014\000"- 00:08:55.533 #38 NEW cov: 11274 ft: 18419 corp: 11/61b lim: 6 exec/s: 19 rss: 77Mb L: 6/6 MS: 1 CopyPart- 00:08:55.533 #38 DONE 
cov: 11274 ft: 18419 corp: 11/61b lim: 6 exec/s: 19 rss: 77Mb 00:08:55.533 ###### Recommended dictionary. ###### 00:08:55.533 "\014\000" # Uses: 0 00:08:55.533 ###### End of recommended dictionary. ###### 00:08:55.533 Done 38 runs in 2 second(s) 00:08:55.533 [2024-11-28 12:39:25.546669] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:55.793 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:55.793 12:39:25 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:55.793 [2024-11-28 12:39:25.809912] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
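[Editorial note] Each fuzzer iteration in this log repeats the same setup, visible again in the trace above for fuzzer_type=1: create an isolated /tmp/vfio-user-N workspace, rewrite the shared fuzz_vfio_json.conf template so its vfio-user socket paths land inside that workspace, register known-benign LSAN leak suppressions, and only then launch llvm_vfio_fuzz against it. A condensed sketch of that sequence; the template's source path and the redirection targets of sed and echo are inferred for illustration, since xtrace does not print redirections:

    N=1                                    # fuzzer index, 0..6 in this run
    dir=/tmp/vfio-user-$N
    mkdir -p "$dir" "$dir/domain/1" "$dir/domain/2"
    # Point the generic /tmp/vfio-user paths in the template at this workspace.
    sed -e "s%/tmp/vfio-user/domain/1%$dir/domain/1%;
            s%/tmp/vfio-user/domain/2%$dir/domain/2%" \
        fuzz_vfio_json.conf > "$dir/fuzz_vfio_json.conf"
    # Leaks on these call paths are expected; keep LSAN from failing the run.
    echo leak:spdk_nvmf_qpair_disconnect >> /var/tmp/suppress_vfio_fuzz
    echo leak:nvmf_ctrlr_create >> /var/tmp/suppress_vfio_fuzz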
00:08:55.793 [2024-11-28 12:39:25.809989] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid616016 ] 00:08:56.053 [2024-11-28 12:39:25.944969] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:56.053 [2024-11-28 12:39:25.991306] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:56.053 [2024-11-28 12:39:26.017672] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.312 INFO: Running with entropic power schedule (0xFF, 100). 00:08:56.312 INFO: Seed: 2016432349 00:08:56.312 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:56.312 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:56.312 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:56.312 INFO: A corpus is not provided, starting from an empty corpus 00:08:56.312 #2 INITED exec/s: 0 rss: 67Mb 00:08:56.312 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:56.312 This may also happen if the target rejected all inputs we tried so far 00:08:56.312 [2024-11-28 12:39:26.251283] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:56.312 [2024-11-28 12:39:26.310525] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:56.312 [2024-11-28 12:39:26.310550] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:56.312 [2024-11-28 12:39:26.310569] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:56.572 NEW_FUNC[1/677]: 0x45ecc8 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:56.572 NEW_FUNC[2/677]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:56.572 #21 NEW cov: 11234 ft: 11078 corp: 2/5b lim: 4 exec/s: 0 rss: 74Mb L: 4/4 MS: 4 CopyPart-CopyPart-CopyPart-InsertByte- 00:08:56.830 [2024-11-28 12:39:26.779373] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:56.830 [2024-11-28 12:39:26.779411] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:56.830 [2024-11-28 12:39:26.779430] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:56.830 NEW_FUNC[1/1]: 0x1f8a108 in spdk_get_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1282 00:08:56.830 #32 NEW cov: 11252 ft: 13783 corp: 3/9b lim: 4 exec/s: 0 rss: 75Mb L: 4/4 MS: 1 ChangeByte- 00:08:57.088 [2024-11-28 12:39:26.989839] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.088 [2024-11-28 12:39:26.989863] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.088 [2024-11-28 12:39:26.989897] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.088 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:57.088 #38 NEW cov: 11269 ft: 14518 corp: 4/13b lim: 4 
exec/s: 0 rss: 75Mb L: 4/4 MS: 1 ChangeByte- 00:08:57.088 [2024-11-28 12:39:27.184632] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.088 [2024-11-28 12:39:27.184656] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.088 [2024-11-28 12:39:27.184673] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.346 #39 NEW cov: 11269 ft: 16311 corp: 5/17b lim: 4 exec/s: 39 rss: 76Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:57.346 [2024-11-28 12:39:27.382457] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.346 [2024-11-28 12:39:27.382485] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.346 [2024-11-28 12:39:27.382503] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.605 #49 NEW cov: 11269 ft: 16845 corp: 6/21b lim: 4 exec/s: 49 rss: 76Mb L: 4/4 MS: 5 CrossOver-CrossOver-CopyPart-CrossOver-CopyPart- 00:08:57.605 [2024-11-28 12:39:27.586053] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.605 [2024-11-28 12:39:27.586076] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.605 [2024-11-28 12:39:27.586094] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.605 #50 NEW cov: 11269 ft: 17461 corp: 7/25b lim: 4 exec/s: 50 rss: 76Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:57.864 [2024-11-28 12:39:27.783986] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:57.864 [2024-11-28 12:39:27.784009] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:57.864 [2024-11-28 12:39:27.784028] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:57.864 #51 NEW cov: 11269 ft: 17795 corp: 8/29b lim: 4 exec/s: 51 rss: 76Mb L: 4/4 MS: 1 ShuffleBytes- 00:08:58.124 [2024-11-28 12:39:27.991737] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.124 [2024-11-28 12:39:27.991761] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.124 [2024-11-28 12:39:27.991779] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:58.124 #55 NEW cov: 11276 ft: 18009 corp: 9/33b lim: 4 exec/s: 55 rss: 76Mb L: 4/4 MS: 4 ChangeBinInt-ChangeBit-ChangeByte-CrossOver- 00:08:58.124 [2024-11-28 12:39:28.179417] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.124 [2024-11-28 12:39:28.179440] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.124 [2024-11-28 12:39:28.179458] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:58.384 #61 NEW cov: 11276 ft: 18440 corp: 10/37b lim: 4 exec/s: 30 rss: 76Mb L: 4/4 MS: 1 CrossOver- 00:08:58.384 #61 DONE cov: 11276 ft: 18440 corp: 10/37b lim: 4 exec/s: 30 rss: 76Mb 00:08:58.384 Done 61 runs in 2 second(s) 00:08:58.384 [2024-11-28 12:39:28.316679] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- 
../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:58.644 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:58.644 12:39:28 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:58.644 [2024-11-28 12:39:28.581620] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:08:58.644 [2024-11-28 12:39:28.581692] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid616377 ] 00:08:58.644 [2024-11-28 12:39:28.718039] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:08:58.644 [2024-11-28 12:39:28.764778] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.903 [2024-11-28 12:39:28.791686] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.903 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:58.903 INFO: Seed: 494472276 00:08:58.903 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:08:58.903 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:08:58.903 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:58.903 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.903 #2 INITED exec/s: 0 rss: 67Mb 00:08:58.903 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:58.903 This may also happen if the target rejected all inputs we tried so far 00:08:59.162 [2024-11-28 12:39:29.032813] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:08:59.162 [2024-11-28 12:39:29.079200] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:59.421 NEW_FUNC[1/677]: 0x45f6b8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:08:59.421 NEW_FUNC[2/677]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:59.421 #36 NEW cov: 11214 ft: 11186 corp: 2/9b lim: 8 exec/s: 0 rss: 74Mb L: 8/8 MS: 4 CopyPart-InsertRepeatedBytes-ChangeBit-CopyPart- 00:08:59.680 [2024-11-28 12:39:29.553894] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:59.680 #42 NEW cov: 11228 ft: 14626 corp: 3/17b lim: 8 exec/s: 0 rss: 75Mb L: 8/8 MS: 1 CopyPart- 00:08:59.680 [2024-11-28 12:39:29.745833] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:59.940 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:08:59.940 #53 NEW cov: 11248 ft: 16354 corp: 4/25b lim: 8 exec/s: 0 rss: 76Mb L: 8/8 MS: 1 ChangeBit- 00:08:59.940 [2024-11-28 12:39:29.945778] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:59.940 #59 NEW cov: 11248 ft: 16610 corp: 5/33b lim: 8 exec/s: 59 rss: 77Mb L: 8/8 MS: 1 ChangeBinInt- 00:09:00.199 [2024-11-28 12:39:30.133553] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.199 #60 NEW cov: 11248 ft: 16908 corp: 6/41b lim: 8 exec/s: 60 rss: 77Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:00.199 [2024-11-28 12:39:30.319156] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.458 #66 NEW cov: 11248 ft: 17449 corp: 7/49b lim: 8 exec/s: 66 rss: 77Mb L: 8/8 MS: 1 CMP- DE: "\377\377\377\377\377\377\3773"- 00:09:00.458 [2024-11-28 12:39:30.506576] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.717 #72 NEW cov: 11248 ft: 17520 corp: 8/57b lim: 8 exec/s: 72 rss: 77Mb L: 8/8 MS: 1 ChangeBit- 00:09:00.717 [2024-11-28 12:39:30.688533] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.717 #78 NEW cov: 11255 ft: 17798 corp: 9/65b lim: 8 exec/s: 78 rss: 77Mb L: 8/8 MS: 1 CrossOver- 00:09:00.977 [2024-11-28 12:39:30.870641] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:00.977 #79 NEW cov: 11255 ft: 17931 corp: 10/73b lim: 8 exec/s: 79 rss: 77Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:00.977 [2024-11-28 12:39:31.059721] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized 
argument length, command 5 00:09:01.245 #80 NEW cov: 11255 ft: 18040 corp: 11/81b lim: 8 exec/s: 40 rss: 77Mb L: 8/8 MS: 1 CopyPart- 00:09:01.245 #80 DONE cov: 11255 ft: 18040 corp: 11/81b lim: 8 exec/s: 40 rss: 77Mb 00:09:01.245 ###### Recommended dictionary. ###### 00:09:01.245 "\377\377\377\377\377\377\3773" # Uses: 0 00:09:01.245 ###### End of recommended dictionary. ###### 00:09:01.245 Done 80 runs in 2 second(s) 00:09:01.245 [2024-11-28 12:39:31.185664] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:01.504 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:01.504 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:01.505 12:39:31 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:01.505 [2024-11-28 12:39:31.449178] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 
00:09:01.505 [2024-11-28 12:39:31.449253] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid616738 ] 00:09:01.505 [2024-11-28 12:39:31.584326] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:09:01.505 [2024-11-28 12:39:31.630028] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.764 [2024-11-28 12:39:31.654191] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.764 INFO: Running with entropic power schedule (0xFF, 100). 00:09:01.764 INFO: Seed: 3357469887 00:09:01.764 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:09:01.764 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:09:01.764 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:01.764 INFO: A corpus is not provided, starting from an empty corpus 00:09:01.764 #2 INITED exec/s: 0 rss: 67Mb 00:09:01.764 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:01.764 This may also happen if the target rejected all inputs we tried so far 00:09:01.764 [2024-11-28 12:39:31.889875] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:09:02.282 NEW_FUNC[1/676]: 0x45fda8 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:09:02.283 NEW_FUNC[2/676]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:02.283 #16 NEW cov: 11221 ft: 11192 corp: 2/33b lim: 32 exec/s: 0 rss: 74Mb L: 32/32 MS: 4 CrossOver-InsertByte-CrossOver-InsertRepeatedBytes- 00:09:02.542 NEW_FUNC[1/1]: 0x20eae58 in spdk_bit_array_get /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/util/bit_array.c:152 00:09:02.542 #27 NEW cov: 11243 ft: 14331 corp: 3/65b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 1 CrossOver- 00:09:02.802 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:02.802 #28 NEW cov: 11260 ft: 15596 corp: 4/97b lim: 32 exec/s: 0 rss: 76Mb L: 32/32 MS: 1 ChangeByte- 00:09:02.802 #29 NEW cov: 11260 ft: 16582 corp: 5/129b lim: 32 exec/s: 29 rss: 76Mb L: 32/32 MS: 1 CMP- DE: "\377\377\377\377\004\315j\267"- 00:09:03.061 #30 NEW cov: 11260 ft: 17053 corp: 6/161b lim: 32 exec/s: 30 rss: 76Mb L: 32/32 MS: 1 ChangeByte- 00:09:03.320 #36 NEW cov: 11260 ft: 17208 corp: 7/193b lim: 32 exec/s: 36 rss: 76Mb L: 32/32 MS: 1 CopyPart- 00:09:03.320 #37 NEW cov: 11260 ft: 17298 corp: 8/225b lim: 32 exec/s: 37 rss: 76Mb L: 32/32 MS: 1 ChangeByte- 00:09:03.579 #38 NEW cov: 11260 ft: 17362 corp: 9/257b lim: 32 exec/s: 38 rss: 76Mb L: 32/32 MS: 1 CopyPart- 00:09:03.838 #39 NEW cov: 11267 ft: 17474 corp: 10/289b lim: 32 exec/s: 39 rss: 76Mb L: 32/32 MS: 1 CrossOver- 00:09:04.098 #45 NEW cov: 11267 ft: 17516 corp: 11/321b lim: 32 exec/s: 22 rss: 76Mb L: 32/32 MS: 1 ChangeByte- 00:09:04.098 #45 DONE cov: 11267 ft: 17516 corp: 11/321b lim: 32 exec/s: 22 rss: 76Mb 00:09:04.098 ###### Recommended dictionary. ###### 00:09:04.098 "\377\377\377\377\004\315j\267" # Uses: 1 00:09:04.098 ###### End of recommended dictionary. 
###### 00:09:04.098 Done 45 runs in 2 second(s) 00:09:04.098 [2024-11-28 12:39:33.986671] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:09:04.098 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:09:04.098 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:04.098 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:04.098 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:04.098 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:04.098 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:04.098 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:04.099 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:04.099 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:04.358 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:04.358 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:04.358 12:39:34 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:04.358 [2024-11-28 12:39:34.256401] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:09:04.358 [2024-11-28 12:39:34.256496] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid617100 ] 00:09:04.358 [2024-11-28 12:39:34.392351] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 
00:09:04.358 [2024-11-28 12:39:34.438637] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.358 [2024-11-28 12:39:34.465169] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.618 INFO: Running with entropic power schedule (0xFF, 100). 00:09:04.618 INFO: Seed: 1881514414 00:09:04.618 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:09:04.618 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:09:04.618 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:04.618 INFO: A corpus is not provided, starting from an empty corpus 00:09:04.618 #2 INITED exec/s: 0 rss: 68Mb 00:09:04.618 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:04.618 This may also happen if the target rejected all inputs we tried so far 00:09:04.618 [2024-11-28 12:39:34.714814] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:09:05.135 NEW_FUNC[1/677]: 0x460628 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:09:05.135 NEW_FUNC[2/677]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:05.135 #36 NEW cov: 11225 ft: 11192 corp: 2/33b lim: 32 exec/s: 0 rss: 75Mb L: 32/32 MS: 4 ChangeBit-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:09:05.393 #37 NEW cov: 11241 ft: 14192 corp: 3/65b lim: 32 exec/s: 0 rss: 76Mb L: 32/32 MS: 1 ChangeBit- 00:09:05.652 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:05.652 #38 NEW cov: 11258 ft: 14672 corp: 4/97b lim: 32 exec/s: 0 rss: 77Mb L: 32/32 MS: 1 CopyPart- 00:09:05.652 #39 NEW cov: 11258 ft: 16498 corp: 5/129b lim: 32 exec/s: 39 rss: 77Mb L: 32/32 MS: 1 CopyPart- 00:09:05.911 #40 NEW cov: 11258 ft: 16832 corp: 6/161b lim: 32 exec/s: 40 rss: 77Mb L: 32/32 MS: 1 ChangeBit- 00:09:06.170 #41 NEW cov: 11258 ft: 17467 corp: 7/193b lim: 32 exec/s: 41 rss: 77Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:06.170 #42 NEW cov: 11258 ft: 17868 corp: 8/225b lim: 32 exec/s: 42 rss: 77Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:06.429 #43 NEW cov: 11265 ft: 17967 corp: 9/257b lim: 32 exec/s: 43 rss: 77Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:06.688 #44 NEW cov: 11265 ft: 18294 corp: 10/289b lim: 32 exec/s: 44 rss: 77Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:06.948 #45 NEW cov: 11265 ft: 18317 corp: 11/321b lim: 32 exec/s: 22 rss: 77Mb L: 32/32 MS: 1 CrossOver- 00:09:06.948 #45 DONE cov: 11265 ft: 18317 corp: 11/321b lim: 32 exec/s: 22 rss: 77Mb 00:09:06.948 Done 45 runs in 2 second(s) 00:09:06.948 [2024-11-28 12:39:36.845674] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:09:06.948 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:09:06.948 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:06.948 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:06.948 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:06.948 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 
00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:07.207 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:07.207 12:39:37 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:07.207 [2024-11-28 12:39:37.115184] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:09:07.207 [2024-11-28 12:39:37.115257] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid617460 ] 00:09:07.207 [2024-11-28 12:39:37.251178] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:09:07.207 [2024-11-28 12:39:37.295974] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.207 [2024-11-28 12:39:37.321135] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.466 INFO: Running with entropic power schedule (0xFF, 100). 00:09:07.466 INFO: Seed: 435546115 00:09:07.466 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:09:07.466 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:09:07.466 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:07.466 INFO: A corpus is not provided, starting from an empty corpus 00:09:07.466 #2 INITED exec/s: 0 rss: 67Mb 00:09:07.467 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:07.467 This may also happen if the target rejected all inputs we tried so far 00:09:07.467 [2024-11-28 12:39:37.557038] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:09:07.726 [2024-11-28 12:39:37.608547] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:07.726 [2024-11-28 12:39:37.608587] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:07.985 NEW_FUNC[1/678]: 0x461028 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:09:07.985 NEW_FUNC[2/678]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:07.985 #49 NEW cov: 11237 ft: 11138 corp: 2/14b lim: 13 exec/s: 0 rss: 74Mb L: 13/13 MS: 2 CrossOver-InsertRepeatedBytes- 00:09:07.985 [2024-11-28 12:39:38.088770] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:07.985 [2024-11-28 12:39:38.088817] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.245 #50 NEW cov: 11251 ft: 14644 corp: 3/27b lim: 13 exec/s: 0 rss: 75Mb L: 13/13 MS: 1 ChangeBit- 00:09:08.245 [2024-11-28 12:39:38.282049] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.245 [2024-11-28 12:39:38.282083] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.503 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:08.503 #56 NEW cov: 11271 ft: 15744 corp: 4/40b lim: 13 exec/s: 0 rss: 76Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:08.503 [2024-11-28 12:39:38.475699] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.503 [2024-11-28 12:39:38.475730] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.503 #57 NEW cov: 11271 ft: 16040 corp: 5/53b lim: 13 exec/s: 57 rss: 77Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:08.761 [2024-11-28 12:39:38.663453] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.761 [2024-11-28 12:39:38.663493] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:08.761 #68 NEW cov: 11271 ft: 16706 corp: 6/66b lim: 13 exec/s: 68 rss: 77Mb L: 13/13 MS: 1 ChangeByte- 00:09:08.761 [2024-11-28 12:39:38.856092] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:08.761 [2024-11-28 12:39:38.856129] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:09.021 #74 NEW cov: 11271 ft: 16824 corp: 7/79b lim: 13 exec/s: 74 rss: 77Mb L: 13/13 MS: 1 ChangeBit- 00:09:09.021 [2024-11-28 12:39:39.039973] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:09.021 [2024-11-28 12:39:39.040003] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:09.280 #78 NEW cov: 11271 ft: 17209 corp: 8/92b lim: 13 exec/s: 78 rss: 77Mb L: 13/13 MS: 4 EraseBytes-ChangeBinInt-InsertByte-CMP- DE: "\000\200"- 00:09:09.280 [2024-11-28 12:39:39.228087] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:09.280 [2024-11-28 12:39:39.228117] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 
00:09:09.280 #86 NEW cov: 11278 ft: 17302 corp: 9/105b lim: 13 exec/s: 86 rss: 77Mb L: 13/13 MS: 3 EraseBytes-PersAutoDict-CrossOver- DE: "\000\200"- 00:09:09.539 [2024-11-28 12:39:39.414935] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:09.539 [2024-11-28 12:39:39.414966] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:09.539 #92 NEW cov: 11278 ft: 17818 corp: 10/118b lim: 13 exec/s: 92 rss: 77Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:09.539 [2024-11-28 12:39:39.597909] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:09.539 [2024-11-28 12:39:39.597938] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:09.798 #93 NEW cov: 11278 ft: 17905 corp: 11/131b lim: 13 exec/s: 46 rss: 77Mb L: 13/13 MS: 1 ChangeBit- 00:09:09.798 #93 DONE cov: 11278 ft: 17905 corp: 11/131b lim: 13 exec/s: 46 rss: 77Mb 00:09:09.798 ###### Recommended dictionary. ###### 00:09:09.798 "\000\200" # Uses: 1 00:09:09.798 ###### End of recommended dictionary. ###### 00:09:09.798 Done 93 runs in 2 second(s) 00:09:09.798 [2024-11-28 12:39:39.733684] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@23 -- # local timen=1 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@24 -- # local core=0x1 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:10.058 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@44 -- # echo 
leak:nvmf_ctrlr_create 00:09:10.058 12:39:39 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:10.058 [2024-11-28 12:39:40.004694] Starting SPDK v25.01-pre git sha1 35cd3e84d / DPDK 24.11.0-rc4 initialization... 00:09:10.058 [2024-11-28 12:39:40.004781] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid617829 ] 00:09:10.058 [2024-11-28 12:39:40.144200] pci_dpdk.c: 38:dpdk_pci_init: *NOTICE*: In-development DPDK 24.11.0-rc4 is used. There is no support for it in SPDK. Enabled only for validation. 00:09:10.318 [2024-11-28 12:39:40.192116] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:10.318 [2024-11-28 12:39:40.217408] reactor.c:1005:reactor_run: *NOTICE*: Reactor started on core 0 00:09:10.318 INFO: Running with entropic power schedule (0xFF, 100). 00:09:10.318 INFO: Seed: 3340546740 00:09:10.318 INFO: Loaded 1 modules (387025 inline 8-bit counters): 387025 [0x2ab678c, 0x2b14f5d), 00:09:10.318 INFO: Loaded 1 PC tables (387025 PCs): 387025 [0x2b14f60,0x30fcc70), 00:09:10.318 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:10.318 INFO: A corpus is not provided, starting from an empty corpus 00:09:10.318 #2 INITED exec/s: 0 rss: 67Mb 00:09:10.318 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:10.318 This may also happen if the target rejected all inputs we tried so far 00:09:10.577 [2024-11-28 12:39:40.460179] vfio_user.c:2840:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:10.577 [2024-11-28 12:39:40.514514] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:10.577 [2024-11-28 12:39:40.514549] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:10.837 NEW_FUNC[1/678]: 0x461d18 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:10.837 NEW_FUNC[2/678]: 0x464238 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:10.837 #37 NEW cov: 11229 ft: 11197 corp: 2/10b lim: 9 exec/s: 0 rss: 74Mb L: 9/9 MS: 5 InsertRepeatedBytes-CopyPart-ChangeBinInt-CrossOver-InsertByte- 00:09:11.096 [2024-11-28 12:39:40.987566] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.096 [2024-11-28 12:39:40.987610] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.096 #39 NEW cov: 11246 ft: 13381 corp: 3/19b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 2 InsertRepeatedBytes-InsertByte- 00:09:11.096 [2024-11-28 12:39:41.190555] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.096 [2024-11-28 12:39:41.190586] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.355 NEW_FUNC[1/1]: 0x1c347f8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:662 00:09:11.355 #45 NEW cov: 11263 ft: 13994 corp: 4/28b lim: 9 exec/s: 0 rss: 75Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:11.355 [2024-11-28 12:39:41.383929] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.355 [2024-11-28 12:39:41.383960] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.614 #55 NEW cov: 11263 ft: 15437 corp: 5/37b lim: 9 exec/s: 55 rss: 75Mb L: 9/9 MS: 5 ChangeByte-InsertRepeatedBytes-InsertByte-ChangeBinInt-InsertByte- 00:09:11.614 [2024-11-28 12:39:41.590630] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.614 [2024-11-28 12:39:41.590664] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.614 #57 NEW cov: 11263 ft: 15963 corp: 6/46b lim: 9 exec/s: 57 rss: 75Mb L: 9/9 MS: 2 CrossOver-CopyPart- 00:09:11.873 [2024-11-28 12:39:41.783439] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.873 [2024-11-28 12:39:41.783475] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.873 #58 NEW cov: 11263 ft: 16519 corp: 7/55b lim: 9 exec/s: 58 rss: 75Mb L: 9/9 MS: 1 ChangeBit- 00:09:11.873 [2024-11-28 12:39:41.983564] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.873 [2024-11-28 12:39:41.983595] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.132 #59 NEW cov: 11263 ft: 16735 corp: 8/64b lim: 9 exec/s: 59 rss: 75Mb L: 9/9 MS: 1 ChangeBit- 00:09:12.132 [2024-11-28 12:39:42.170095] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.132 [2024-11-28 12:39:42.170126] vfio_user.c: 
144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.392 #60 NEW cov: 11270 ft: 17712 corp: 9/73b lim: 9 exec/s: 60 rss: 75Mb L: 9/9 MS: 1 ChangeByte- 00:09:12.392 [2024-11-28 12:39:42.356873] vfio_user.c:3110:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.392 [2024-11-28 12:39:42.356905] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.392 #66 NEW cov: 11270 ft: 18195 corp: 10/82b lim: 9 exec/s: 33 rss: 75Mb L: 9/9 MS: 1 ShuffleBytes- 00:09:12.392 #66 DONE cov: 11270 ft: 18195 corp: 10/82b lim: 9 exec/s: 33 rss: 75Mb 00:09:12.392 Done 66 runs in 2 second(s) 00:09:12.392 [2024-11-28 12:39:42.492675] vfio_user.c:2802:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:12.651 12:39:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:12.651 12:39:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i++ )) 00:09:12.651 12:39:42 llvm_fuzz.vfio_llvm_fuzz -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:12.651 12:39:42 llvm_fuzz.vfio_llvm_fuzz -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:12.651 00:09:12.651 real 0m20.182s 00:09:12.651 user 0m27.645s 00:09:12.651 sys 0m1.942s 00:09:12.651 12:39:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:12.651 12:39:42 llvm_fuzz.vfio_llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:12.651 ************************************ 00:09:12.651 END TEST vfio_llvm_fuzz 00:09:12.651 ************************************ 00:09:12.651 00:09:12.651 real 1m27.151s 00:09:12.651 user 2m7.078s 00:09:12.651 sys 0m10.737s 00:09:12.651 12:39:42 llvm_fuzz -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:12.651 12:39:42 llvm_fuzz -- common/autotest_common.sh@10 -- # set +x 00:09:12.651 ************************************ 00:09:12.651 END TEST llvm_fuzz 00:09:12.651 ************************************ 00:09:12.910 12:39:42 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]] 00:09:12.910 12:39:42 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT 00:09:12.910 12:39:42 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup 00:09:12.910 12:39:42 -- common/autotest_common.sh@726 -- # xtrace_disable 00:09:12.910 12:39:42 -- common/autotest_common.sh@10 -- # set +x 00:09:12.910 12:39:42 -- spdk/autotest.sh@388 -- # autotest_cleanup 00:09:12.910 12:39:42 -- common/autotest_common.sh@1396 -- # local autotest_es=0 00:09:12.910 12:39:42 -- common/autotest_common.sh@1397 -- # xtrace_disable 00:09:12.910 12:39:42 -- common/autotest_common.sh@10 -- # set +x 00:09:17.098 INFO: APP EXITING 00:09:17.098 INFO: killing all VMs 00:09:17.098 INFO: killing vhost app 00:09:17.098 INFO: EXIT DONE 00:09:19.637 Waiting for block devices as requested 00:09:19.637 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:09:19.637 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:19.637 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:19.637 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:19.637 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:19.637 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:19.637 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:19.897 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:19.897 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:19.897 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:20.156 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:20.156 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:20.156 0000:80:04.4 
(8086 2021): vfio-pci -> ioatdma 00:09:20.415 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:20.415 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:20.415 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:20.676 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:23.966 Cleaning 00:09:23.966 Removing: /dev/shm/spdk_tgt_trace.pid594272 00:09:23.966 Removing: /var/run/dpdk/spdk_pid591930 00:09:23.966 Removing: /var/run/dpdk/spdk_pid593067 00:09:23.966 Removing: /var/run/dpdk/spdk_pid594272 00:09:23.966 Removing: /var/run/dpdk/spdk_pid594805 00:09:23.966 Removing: /var/run/dpdk/spdk_pid595548 00:09:23.966 Removing: /var/run/dpdk/spdk_pid595770 00:09:23.966 Removing: /var/run/dpdk/spdk_pid596638 00:09:23.966 Removing: /var/run/dpdk/spdk_pid596659 00:09:23.966 Removing: /var/run/dpdk/spdk_pid597017 00:09:23.966 Removing: /var/run/dpdk/spdk_pid597400 00:09:23.966 Removing: /var/run/dpdk/spdk_pid597645 00:09:23.966 Removing: /var/run/dpdk/spdk_pid597907 00:09:23.966 Removing: /var/run/dpdk/spdk_pid598207 00:09:23.966 Removing: /var/run/dpdk/spdk_pid598425 00:09:23.966 Removing: /var/run/dpdk/spdk_pid598637 00:09:23.966 Removing: /var/run/dpdk/spdk_pid598933 00:09:23.966 Removing: /var/run/dpdk/spdk_pid599535 00:09:23.966 Removing: /var/run/dpdk/spdk_pid602052 00:09:23.966 Removing: /var/run/dpdk/spdk_pid602263 00:09:23.966 Removing: /var/run/dpdk/spdk_pid602518 00:09:23.966 Removing: /var/run/dpdk/spdk_pid602648 00:09:23.966 Removing: /var/run/dpdk/spdk_pid603037 00:09:23.966 Removing: /var/run/dpdk/spdk_pid603209 00:09:23.966 Removing: /var/run/dpdk/spdk_pid603602 00:09:23.966 Removing: /var/run/dpdk/spdk_pid603774 00:09:23.966 Removing: /var/run/dpdk/spdk_pid603989 00:09:23.967 Removing: /var/run/dpdk/spdk_pid604166 00:09:23.967 Removing: /var/run/dpdk/spdk_pid604364 00:09:23.967 Removing: /var/run/dpdk/spdk_pid604387 00:09:23.967 Removing: /var/run/dpdk/spdk_pid604836 00:09:23.967 Removing: /var/run/dpdk/spdk_pid605031 00:09:23.967 Removing: /var/run/dpdk/spdk_pid605233 00:09:23.967 Removing: /var/run/dpdk/spdk_pid605474 00:09:23.967 Removing: /var/run/dpdk/spdk_pid606070 00:09:23.967 Removing: /var/run/dpdk/spdk_pid606430 00:09:23.967 Removing: /var/run/dpdk/spdk_pid606783 00:09:23.967 Removing: /var/run/dpdk/spdk_pid607142 00:09:23.967 Removing: /var/run/dpdk/spdk_pid607501 00:09:23.967 Removing: /var/run/dpdk/spdk_pid607864 00:09:23.967 Removing: /var/run/dpdk/spdk_pid608212 00:09:23.967 Removing: /var/run/dpdk/spdk_pid608589 00:09:23.967 Removing: /var/run/dpdk/spdk_pid608951 00:09:23.967 Removing: /var/run/dpdk/spdk_pid609294 00:09:23.967 Removing: /var/run/dpdk/spdk_pid609669 00:09:23.967 Removing: /var/run/dpdk/spdk_pid610028 00:09:23.967 Removing: /var/run/dpdk/spdk_pid610382 00:09:23.967 Removing: /var/run/dpdk/spdk_pid610738 00:09:23.967 Removing: /var/run/dpdk/spdk_pid611092 00:09:23.967 Removing: /var/run/dpdk/spdk_pid611427 00:09:23.967 Removing: /var/run/dpdk/spdk_pid611781 00:09:23.967 Removing: /var/run/dpdk/spdk_pid612224 00:09:23.967 Removing: /var/run/dpdk/spdk_pid612697 00:09:23.967 Removing: /var/run/dpdk/spdk_pid613347 00:09:23.967 Removing: /var/run/dpdk/spdk_pid613727 00:09:23.967 Removing: /var/run/dpdk/spdk_pid614069 00:09:23.967 Removing: /var/run/dpdk/spdk_pid614411 00:09:23.967 Removing: /var/run/dpdk/spdk_pid614759 00:09:23.967 Removing: /var/run/dpdk/spdk_pid615043 00:09:23.967 Removing: /var/run/dpdk/spdk_pid615645 00:09:23.967 Removing: /var/run/dpdk/spdk_pid616016 00:09:23.967 Removing: /var/run/dpdk/spdk_pid616377 00:09:23.967 Removing: 
/var/run/dpdk/spdk_pid616738 00:09:23.967 Removing: /var/run/dpdk/spdk_pid617100 00:09:23.967 Removing: /var/run/dpdk/spdk_pid617460 00:09:23.967 Removing: /var/run/dpdk/spdk_pid617829 00:09:23.967 Clean 00:09:23.967 12:39:53 -- common/autotest_common.sh@1453 -- # return 0 00:09:23.967 12:39:53 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup 00:09:23.967 12:39:53 -- common/autotest_common.sh@732 -- # xtrace_disable 00:09:23.967 12:39:53 -- common/autotest_common.sh@10 -- # set +x 00:09:23.967 12:39:53 -- spdk/autotest.sh@391 -- # timing_exit autotest 00:09:23.967 12:39:53 -- common/autotest_common.sh@732 -- # xtrace_disable 00:09:23.967 12:39:53 -- common/autotest_common.sh@10 -- # set +x 00:09:23.967 12:39:53 -- spdk/autotest.sh@392 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:09:23.967 12:39:53 -- spdk/autotest.sh@394 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]] 00:09:23.967 12:39:53 -- spdk/autotest.sh@394 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log 00:09:23.967 12:39:53 -- spdk/autotest.sh@396 -- # [[ y == y ]] 00:09:23.967 12:39:53 -- spdk/autotest.sh@398 -- # hostname 00:09:23.967 12:39:53 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-49 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info 00:09:24.226 geninfo: WARNING: invalid characters removed from testname! 
00:09:30.793 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_stubs.gcda 00:09:30.793 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/mdns_server.gcda 00:09:37.361 12:40:06 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:45.475 12:40:14 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:49.660 12:40:19 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:09:56.223 12:40:25 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:00.442 12:40:30 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:07.010 12:40:35 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info 00:10:11.201 12:40:41 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:10:11.201 12:40:41 -- spdk/autorun.sh@1 -- $ timing_finish 00:10:11.201 12:40:41 -- common/autotest_common.sh@738 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt ]] 00:10:11.201 12:40:41 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:10:11.201 12:40:41 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]] 00:10:11.201 12:40:41 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt 00:10:11.201 + [[ -n 478604 ]] 00:10:11.201 + sudo kill 478604 00:10:11.469 [Pipeline] } 00:10:11.484 [Pipeline] // stage 00:10:11.489 [Pipeline] } 00:10:11.504 [Pipeline] // timeout 00:10:11.510 [Pipeline] } 00:10:11.524 [Pipeline] // catchError 00:10:11.529 [Pipeline] } 00:10:11.544 [Pipeline] // wrap 00:10:11.550 [Pipeline] } 00:10:11.563 [Pipeline] // catchError 00:10:11.572 [Pipeline] stage 00:10:11.575 [Pipeline] { (Epilogue) 00:10:11.587 [Pipeline] catchError 00:10:11.589 [Pipeline] { 00:10:11.602 [Pipeline] echo 00:10:11.604 Cleanup processes 00:10:11.610 [Pipeline] sh 00:10:11.893 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:11.893 623927 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:11.907 [Pipeline] sh 00:10:12.191 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:10:12.191 ++ grep -v 'sudo pgrep' 00:10:12.191 ++ awk '{print $1}' 00:10:12.191 + sudo kill -9 00:10:12.191 + true 00:10:12.205 [Pipeline] sh 00:10:12.488 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:10:24.701 [Pipeline] sh 00:10:24.982 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:10:24.982 Artifacts sizes are good 00:10:24.999 [Pipeline] archiveArtifacts 00:10:25.007 Archiving artifacts 00:10:25.187 [Pipeline] sh 00:10:25.489 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest 00:10:25.516 [Pipeline] cleanWs 00:10:25.534 [WS-CLEANUP] Deleting project workspace... 00:10:25.534 [WS-CLEANUP] Deferred wipeout is used... 00:10:25.571 [WS-CLEANUP] done 00:10:25.572 [Pipeline] } 00:10:25.588 [Pipeline] // catchError 00:10:25.598 [Pipeline] sh 00:10:25.875 + logger -p user.info -t JENKINS-CI 00:10:25.884 [Pipeline] } 00:10:25.898 [Pipeline] // stage 00:10:25.903 [Pipeline] } 00:10:25.919 [Pipeline] // node 00:10:25.924 [Pipeline] End of Pipeline 00:10:25.970 Finished: SUCCESS