00:00:00.001 Started by upstream project "autotest-per-patch" build number 121301 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.025 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.025 The recommended git tool is: git 00:00:00.026 using credential 00000000-0000-0000-0000-000000000002 00:00:00.028 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.046 Fetching changes from the remote Git repository 00:00:00.048 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.087 Using shallow fetch with depth 1 00:00:00.087 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.087 > git --version # timeout=10 00:00:00.146 > git --version # 'git version 2.39.2' 00:00:00.146 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.147 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.147 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:03.646 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:03.657 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:03.670 Checking out Revision f964f6d3463483adf05cc5c086f2abd292e05f1d (FETCH_HEAD) 00:00:03.670 > git config core.sparsecheckout # timeout=10 00:00:03.681 > git read-tree -mu HEAD # timeout=10 00:00:03.698 > git checkout -f f964f6d3463483adf05cc5c086f2abd292e05f1d # timeout=5 00:00:03.719 Commit message: "ansible/roles/custom_facts: Drop nvme features" 00:00:03.720 > git rev-list --no-walk f964f6d3463483adf05cc5c086f2abd292e05f1d # timeout=10 00:00:03.811 [Pipeline] Start of Pipeline 00:00:03.825 [Pipeline] library 00:00:03.827 Loading library shm_lib@master 00:00:03.827 Library shm_lib@master is cached. Copying from home. 00:00:03.842 [Pipeline] node 00:00:03.855 Running on WFP39 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:03.856 [Pipeline] { 00:00:03.866 [Pipeline] catchError 00:00:03.867 [Pipeline] { 00:00:03.877 [Pipeline] wrap 00:00:03.884 [Pipeline] { 00:00:03.889 [Pipeline] stage 00:00:03.890 [Pipeline] { (Prologue) 00:00:04.067 [Pipeline] sh 00:00:04.348 + logger -p user.info -t JENKINS-CI 00:00:04.365 [Pipeline] echo 00:00:04.366 Node: WFP39 00:00:04.371 [Pipeline] sh 00:00:04.665 [Pipeline] setCustomBuildProperty 00:00:04.676 [Pipeline] echo 00:00:04.677 Cleanup processes 00:00:04.680 [Pipeline] sh 00:00:04.958 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.958 1490996 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:04.971 [Pipeline] sh 00:00:05.252 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:05.252 ++ grep -v 'sudo pgrep' 00:00:05.252 ++ awk '{print $1}' 00:00:05.252 + sudo kill -9 00:00:05.252 + true 00:00:05.266 [Pipeline] cleanWs 00:00:05.275 [WS-CLEANUP] Deleting project workspace... 00:00:05.275 [WS-CLEANUP] Deferred wipeout is used... 
00:00:05.281 [WS-CLEANUP] done 00:00:05.286 [Pipeline] setCustomBuildProperty 00:00:05.298 [Pipeline] sh 00:00:05.578 + sudo git config --global --replace-all safe.directory '*' 00:00:05.637 [Pipeline] nodesByLabel 00:00:05.639 Found a total of 1 nodes with the 'sorcerer' label 00:00:05.646 [Pipeline] httpRequest 00:00:05.651 HttpMethod: GET 00:00:05.651 URL: http://10.211.164.96/packages/jbp_f964f6d3463483adf05cc5c086f2abd292e05f1d.tar.gz 00:00:05.654 Sending request to url: http://10.211.164.96/packages/jbp_f964f6d3463483adf05cc5c086f2abd292e05f1d.tar.gz 00:00:05.664 Response Code: HTTP/1.1 200 OK 00:00:05.664 Success: Status code 200 is in the accepted range: 200,404 00:00:05.665 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_f964f6d3463483adf05cc5c086f2abd292e05f1d.tar.gz 00:00:06.501 [Pipeline] sh 00:00:06.780 + tar --no-same-owner -xf jbp_f964f6d3463483adf05cc5c086f2abd292e05f1d.tar.gz 00:00:06.798 [Pipeline] httpRequest 00:00:06.802 HttpMethod: GET 00:00:06.802 URL: http://10.211.164.96/packages/spdk_13a9f2aa2c8d16e1c0b567a5c5d4ffca17d0a7b1.tar.gz 00:00:06.803 Sending request to url: http://10.211.164.96/packages/spdk_13a9f2aa2c8d16e1c0b567a5c5d4ffca17d0a7b1.tar.gz 00:00:06.819 Response Code: HTTP/1.1 200 OK 00:00:06.820 Success: Status code 200 is in the accepted range: 200,404 00:00:06.820 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_13a9f2aa2c8d16e1c0b567a5c5d4ffca17d0a7b1.tar.gz 00:00:41.186 [Pipeline] sh 00:00:41.472 + tar --no-same-owner -xf spdk_13a9f2aa2c8d16e1c0b567a5c5d4ffca17d0a7b1.tar.gz 00:00:44.024 [Pipeline] sh 00:00:44.305 + git -C spdk log --oneline -n5 00:00:44.305 13a9f2aa2 lib/idxd: deallocate batches on channel creation failure 00:00:44.305 1eae9e764 lib/idxd: simplify early failure in _dsa_alloc_batches 00:00:44.305 cb6e41c35 test/idxd: suppress accel-config leaks 00:00:44.305 b08c61ff7 lib/idxd: fix device reference counting in kernel IDXD 00:00:44.305 8571999d8 test/scheduler: Stop moving all processes between cgroups 00:00:44.318 [Pipeline] } 00:00:44.336 [Pipeline] // stage 00:00:44.344 [Pipeline] stage 00:00:44.346 [Pipeline] { (Prepare) 00:00:44.364 [Pipeline] writeFile 00:00:44.380 [Pipeline] sh 00:00:44.659 + logger -p user.info -t JENKINS-CI 00:00:44.673 [Pipeline] sh 00:00:44.952 + logger -p user.info -t JENKINS-CI 00:00:44.965 [Pipeline] sh 00:00:45.247 + cat autorun-spdk.conf 00:00:45.247 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:45.247 SPDK_TEST_FUZZER_SHORT=1 00:00:45.247 SPDK_TEST_FUZZER=1 00:00:45.247 SPDK_RUN_UBSAN=1 00:00:45.254 RUN_NIGHTLY=0 00:00:45.259 [Pipeline] readFile 00:00:45.288 [Pipeline] withEnv 00:00:45.291 [Pipeline] { 00:00:45.306 [Pipeline] sh 00:00:45.588 + set -ex 00:00:45.588 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:00:45.588 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:45.588 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:45.588 ++ SPDK_TEST_FUZZER_SHORT=1 00:00:45.588 ++ SPDK_TEST_FUZZER=1 00:00:45.588 ++ SPDK_RUN_UBSAN=1 00:00:45.588 ++ RUN_NIGHTLY=0 00:00:45.588 + case $SPDK_TEST_NVMF_NICS in 00:00:45.588 + DRIVERS= 00:00:45.588 + [[ -n '' ]] 00:00:45.588 + exit 0 00:00:45.598 [Pipeline] } 00:00:45.616 [Pipeline] // withEnv 00:00:45.622 [Pipeline] } 00:00:45.639 [Pipeline] // stage 00:00:45.649 [Pipeline] catchError 00:00:45.651 [Pipeline] { 00:00:45.666 [Pipeline] timeout 00:00:45.666 Timeout set to expire in 30 min 00:00:45.668 [Pipeline] { 00:00:45.683 [Pipeline] stage 00:00:45.686 [Pipeline] { (Tests) 
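Aside: the autorun-spdk.conf assembled above is a plain shell fragment that spdk/autorun.sh sources later in this run. A minimal sketch of reproducing the same configuration by hand, assuming an SPDK checkout in ./spdk (the working directory and file location here are illustrative, not taken from the log):

cat > autorun-spdk.conf <<'EOF'
SPDK_RUN_FUNCTIONAL_TEST=1
SPDK_TEST_FUZZER_SHORT=1
SPDK_TEST_FUZZER=1
SPDK_RUN_UBSAN=1
RUN_NIGHTLY=0
EOF
./spdk/autorun.sh "$PWD/autorun-spdk.conf"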
00:00:45.701 [Pipeline] sh 00:00:45.979 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:45.979 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:45.979 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:00:45.979 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:00:45.979 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:45.979 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:45.979 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:00:45.979 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:45.979 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:00:45.979 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:00:45.979 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:45.979 + source /etc/os-release 00:00:45.979 ++ NAME='Fedora Linux' 00:00:45.979 ++ VERSION='38 (Cloud Edition)' 00:00:45.979 ++ ID=fedora 00:00:45.979 ++ VERSION_ID=38 00:00:45.979 ++ VERSION_CODENAME= 00:00:45.979 ++ PLATFORM_ID=platform:f38 00:00:45.979 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:00:45.979 ++ ANSI_COLOR='0;38;2;60;110;180' 00:00:45.979 ++ LOGO=fedora-logo-icon 00:00:45.979 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:00:45.979 ++ HOME_URL=https://fedoraproject.org/ 00:00:45.979 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:00:45.979 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:00:45.979 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:00:45.979 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:00:45.979 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:00:45.979 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:00:45.979 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:00:45.979 ++ SUPPORT_END=2024-05-14 00:00:45.979 ++ VARIANT='Cloud Edition' 00:00:45.979 ++ VARIANT_ID=cloud 00:00:45.979 + uname -a 00:00:45.979 Linux spdk-wfp-39 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux 00:00:45.979 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:00:49.264 Hugepages 00:00:49.264 node hugesize free / total 00:00:49.264 node0 1048576kB 0 / 0 00:00:49.264 node0 2048kB 0 / 0 00:00:49.264 node1 1048576kB 0 / 0 00:00:49.264 node1 2048kB 0 / 0 00:00:49.264 00:00:49.264 Type BDF Vendor Device NUMA Driver Device Block devices 00:00:49.264 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:00:49.264 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:00:49.264 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:00:49.264 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:00:49.264 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:00:49.264 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:00:49.264 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:00:49.264 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:00:49.264 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:00:49.264 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:00:49.523 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:00:49.523 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:00:49.523 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:00:49.523 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:00:49.523 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:00:49.523 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:00:49.523 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:00:49.523 + rm -f /tmp/spdk-ld-path 00:00:49.523 + source autorun-spdk.conf 00:00:49.523 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:00:49.523 ++ 
SPDK_TEST_FUZZER_SHORT=1 00:00:49.523 ++ SPDK_TEST_FUZZER=1 00:00:49.523 ++ SPDK_RUN_UBSAN=1 00:00:49.523 ++ RUN_NIGHTLY=0 00:00:49.523 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:00:49.523 + [[ -n '' ]] 00:00:49.523 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:49.523 + for M in /var/spdk/build-*-manifest.txt 00:00:49.523 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:00:49.523 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:49.523 + for M in /var/spdk/build-*-manifest.txt 00:00:49.523 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:00:49.523 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:00:49.523 ++ uname 00:00:49.523 + [[ Linux == \L\i\n\u\x ]] 00:00:49.523 + sudo dmesg -T 00:00:49.523 + sudo dmesg --clear 00:00:49.523 + dmesg_pid=1492022 00:00:49.523 + [[ Fedora Linux == FreeBSD ]] 00:00:49.523 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:49.523 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:00:49.523 + sudo dmesg -Tw 00:00:49.523 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:00:49.523 + [[ -x /usr/src/fio-static/fio ]] 00:00:49.523 + export FIO_BIN=/usr/src/fio-static/fio 00:00:49.523 + FIO_BIN=/usr/src/fio-static/fio 00:00:49.523 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:00:49.523 + [[ ! -v VFIO_QEMU_BIN ]] 00:00:49.523 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:00:49.523 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:49.523 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:00:49.523 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:00:49.523 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:49.523 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:00:49.523 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:00:49.523 Test configuration: 00:00:49.523 SPDK_RUN_FUNCTIONAL_TEST=1 00:00:49.523 SPDK_TEST_FUZZER_SHORT=1 00:00:49.523 SPDK_TEST_FUZZER=1 00:00:49.523 SPDK_RUN_UBSAN=1 00:00:49.782 RUN_NIGHTLY=0 19:56:34 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:00:49.782 19:56:34 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:00:49.782 19:56:34 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:00:49.782 19:56:34 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:00:49.782 19:56:34 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:49.782 19:56:34 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:49.782 19:56:34 -- paths/export.sh@4 -- $ 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:49.782 19:56:34 -- paths/export.sh@5 -- $ export PATH 00:00:49.782 19:56:34 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:00:49.782 19:56:34 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:00:49.782 19:56:34 -- common/autobuild_common.sh@435 -- $ date +%s 00:00:49.782 19:56:34 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1714154194.XXXXXX 00:00:49.782 19:56:34 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1714154194.QkaJot 00:00:49.782 19:56:34 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]] 00:00:49.782 19:56:34 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']' 00:00:49.782 19:56:34 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:00:49.782 19:56:34 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:00:49.782 19:56:34 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:00:49.782 19:56:34 -- common/autobuild_common.sh@451 -- $ get_config_params 00:00:49.782 19:56:34 -- common/autotest_common.sh@395 -- $ xtrace_disable 00:00:49.782 19:56:34 -- common/autotest_common.sh@10 -- $ set +x 00:00:49.782 19:56:34 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:00:49.782 19:56:34 -- common/autobuild_common.sh@453 -- $ start_monitor_resources 00:00:49.782 19:56:34 -- pm/common@17 -- $ local monitor 00:00:49.782 19:56:34 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:49.782 19:56:34 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1492058 00:00:49.782 19:56:34 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:49.782 19:56:34 -- pm/common@21 -- $ date +%s 00:00:49.782 19:56:34 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1492061 00:00:49.782 19:56:34 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:49.782 19:56:34 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1492063 00:00:49.782 19:56:34 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:00:49.782 19:56:34 -- pm/common@21 -- $ date +%s 00:00:49.782 19:56:34 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1714154194 00:00:49.782 19:56:34 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1492065 00:00:49.782 19:56:34 -- pm/common@26 -- $ sleep 1 00:00:49.782 19:56:34 -- pm/common@21 -- $ date +%s 00:00:49.782 19:56:34 -- pm/common@21 -- $ date +%s 00:00:49.782 19:56:34 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1714154194 00:00:49.782 19:56:34 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1714154194 00:00:49.782 19:56:34 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1714154194 00:00:49.782 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1714154194_collect-cpu-load.pm.log 00:00:49.782 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1714154194_collect-bmc-pm.bmc.pm.log 00:00:49.782 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1714154194_collect-vmstat.pm.log 00:00:49.782 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1714154194_collect-cpu-temp.pm.log 00:00:50.719 19:56:35 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT 00:00:50.719 19:56:35 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:00:50.719 19:56:35 -- spdk/autobuild.sh@12 -- $ umask 022 00:00:50.719 19:56:35 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:50.719 19:56:35 -- spdk/autobuild.sh@16 -- $ date -u 00:00:50.719 Fri Apr 26 05:56:35 PM UTC 2024 00:00:50.719 19:56:35 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:00:50.720 v24.05-pre-453-g13a9f2aa2 00:00:50.720 19:56:35 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:00:50.720 19:56:35 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:00:50.720 19:56:35 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:00:50.720 19:56:35 -- common/autotest_common.sh@1097 -- $ '[' 3 -le 1 ']' 00:00:50.720 19:56:35 -- common/autotest_common.sh@1103 -- $ xtrace_disable 00:00:50.720 19:56:35 -- common/autotest_common.sh@10 -- $ set +x 00:00:50.980 ************************************ 00:00:50.980 START TEST ubsan 00:00:50.980 ************************************ 00:00:50.980 19:56:35 -- common/autotest_common.sh@1121 -- $ echo 'using ubsan' 00:00:50.980 using ubsan 00:00:50.980 00:00:50.980 real 0m0.000s 00:00:50.980 user 0m0.000s 00:00:50.980 sys 0m0.000s 00:00:50.980 19:56:35 -- common/autotest_common.sh@1122 -- $ xtrace_disable 00:00:50.980 19:56:35 -- common/autotest_common.sh@10 -- $ set +x 00:00:50.980 ************************************ 00:00:50.980 END TEST ubsan 00:00:50.980 ************************************ 00:00:50.980 19:56:35 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:00:50.980 19:56:35 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:00:50.980 19:56:35 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:00:50.980 19:56:35 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 
00:00:50.980 19:56:35 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:00:50.980 19:56:35 -- common/autobuild_common.sh@423 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:00:50.980 19:56:35 -- common/autotest_common.sh@1097 -- $ '[' 2 -le 1 ']' 00:00:50.980 19:56:35 -- common/autotest_common.sh@1103 -- $ xtrace_disable 00:00:50.980 19:56:35 -- common/autotest_common.sh@10 -- $ set +x 00:00:51.240 ************************************ 00:00:51.240 START TEST autobuild_llvm_precompile 00:00:51.240 ************************************ 00:00:51.240 19:56:35 -- common/autotest_common.sh@1121 -- $ _llvm_precompile 00:00:51.240 19:56:35 -- common/autobuild_common.sh@32 -- $ clang --version 00:00:51.240 19:56:35 -- common/autobuild_common.sh@32 -- $ [[ clang version 16.0.6 (Fedora 16.0.6-3.fc38) 00:00:51.240 Target: x86_64-redhat-linux-gnu 00:00:51.240 Thread model: posix 00:00:51.240 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:00:51.240 19:56:35 -- common/autobuild_common.sh@33 -- $ clang_num=16 00:00:51.240 19:56:35 -- common/autobuild_common.sh@35 -- $ export CC=clang-16 00:00:51.240 19:56:35 -- common/autobuild_common.sh@35 -- $ CC=clang-16 00:00:51.240 19:56:35 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-16 00:00:51.240 19:56:35 -- common/autobuild_common.sh@36 -- $ CXX=clang++-16 00:00:51.240 19:56:35 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:00:51.240 19:56:35 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:00:51.240 19:56:35 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a ]] 00:00:51.240 19:56:35 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a' 00:00:51.240 19:56:35 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:00:51.499 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:00:51.499 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:00:51.757 Using 'verbs' RDMA provider 00:01:07.573 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:19.788 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:19.788 Creating mk/config.mk...done. 00:01:19.789 Creating mk/cc.flags.mk...done. 00:01:19.789 Type 'make' to build. 
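Aside: the fuzzer_libs glob expanded above is how this job locates clang's libFuzzer runtime for the detected compiler major version. A rough standalone sketch of the same lookup, assuming bash and a clang on PATH (the version parsing below is an assumption, not the CI script itself):

clang_num=$(clang --version | sed -n 's/.*clang version \([0-9]\+\).*/\1/p')  # e.g. 16
shopt -s extglob nullglob
fuzzer_libs=(/usr/lib*/clang/"$clang_num"/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
echo "fuzzer lib: ${fuzzer_libs[0]}"  # resolved here as /usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a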
00:01:19.789 00:01:19.789 real 0m28.423s 00:01:19.789 user 0m12.418s 00:01:19.789 sys 0m15.331s 00:01:19.789 19:57:03 -- common/autotest_common.sh@1122 -- $ xtrace_disable 00:01:19.789 19:57:03 -- common/autotest_common.sh@10 -- $ set +x 00:01:19.789 ************************************ 00:01:19.789 END TEST autobuild_llvm_precompile 00:01:19.789 ************************************ 00:01:19.789 19:57:03 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:19.789 19:57:03 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:19.789 19:57:03 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:19.789 19:57:03 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:01:19.789 19:57:03 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib64/clang/16/lib/linux/libclang_rt.fuzzer_no_main-x86_64.a 00:01:19.789 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:19.789 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:20.358 Using 'verbs' RDMA provider 00:01:33.623 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:45.830 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:45.830 Creating mk/config.mk...done. 00:01:45.830 Creating mk/cc.flags.mk...done. 00:01:45.830 Type 'make' to build. 00:01:45.830 19:57:29 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:45.830 19:57:29 -- common/autotest_common.sh@1097 -- $ '[' 3 -le 1 ']' 00:01:45.830 19:57:29 -- common/autotest_common.sh@1103 -- $ xtrace_disable 00:01:45.830 19:57:29 -- common/autotest_common.sh@10 -- $ set +x 00:01:45.830 ************************************ 00:01:45.830 START TEST make 00:01:45.830 ************************************ 00:01:45.830 19:57:29 -- common/autotest_common.sh@1121 -- $ make -j72 00:01:46.088 make[1]: Nothing to be done for 'all'. 
00:01:48.001 The Meson build system 00:01:48.001 Version: 1.3.1 00:01:48.001 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:01:48.001 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:48.001 Build type: native build 00:01:48.001 Project name: libvfio-user 00:01:48.001 Project version: 0.0.1 00:01:48.001 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)") 00:01:48.001 C linker for the host machine: clang-16 ld.bfd 2.39-16 00:01:48.001 Host machine cpu family: x86_64 00:01:48.001 Host machine cpu: x86_64 00:01:48.001 Run-time dependency threads found: YES 00:01:48.001 Library dl found: YES 00:01:48.001 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:48.001 Run-time dependency json-c found: YES 0.17 00:01:48.001 Run-time dependency cmocka found: YES 1.1.7 00:01:48.001 Program pytest-3 found: NO 00:01:48.001 Program flake8 found: NO 00:01:48.001 Program misspell-fixer found: NO 00:01:48.001 Program restructuredtext-lint found: NO 00:01:48.001 Program valgrind found: YES (/usr/bin/valgrind) 00:01:48.001 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:48.001 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:48.001 Compiler for C supports arguments -Wwrite-strings: YES 00:01:48.001 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:01:48.001 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:01:48.001 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:01:48.001 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
00:01:48.001 Build targets in project: 8 00:01:48.001 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:01:48.001 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:01:48.001 00:01:48.001 libvfio-user 0.0.1 00:01:48.001 00:01:48.001 User defined options 00:01:48.001 buildtype : debug 00:01:48.001 default_library: static 00:01:48.001 libdir : /usr/local/lib 00:01:48.001 00:01:48.001 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:48.001 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:48.260 [1/36] Compiling C object samples/null.p/null.c.o 00:01:48.260 [2/36] Compiling C object samples/lspci.p/lspci.c.o 00:01:48.260 [3/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:01:48.260 [4/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:01:48.260 [5/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:01:48.260 [6/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:01:48.260 [7/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:01:48.260 [8/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:01:48.260 [9/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:01:48.260 [10/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:01:48.260 [11/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:01:48.260 [12/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:01:48.260 [13/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:01:48.260 [14/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:01:48.260 [15/36] Compiling C object test/unit_tests.p/mocks.c.o 00:01:48.260 [16/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:01:48.260 [17/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:01:48.260 [18/36] Compiling C object samples/server.p/server.c.o 00:01:48.260 [19/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:01:48.260 [20/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:01:48.260 [21/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:01:48.260 [22/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:01:48.260 [23/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:01:48.260 [24/36] Compiling C object samples/client.p/client.c.o 00:01:48.260 [25/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:01:48.260 [26/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:01:48.260 [27/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:01:48.260 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:01:48.260 [29/36] Linking static target lib/libvfio-user.a 00:01:48.260 [30/36] Linking target samples/client 00:01:48.260 [31/36] Linking target samples/null 00:01:48.260 [32/36] Linking target samples/gpio-pci-idio-16 00:01:48.260 [33/36] Linking target test/unit_tests 00:01:48.260 [34/36] Linking target samples/lspci 00:01:48.260 [35/36] Linking target samples/server 00:01:48.260 [36/36] Linking target samples/shadow_ioeventfd_server 00:01:48.260 INFO: autodetecting backend as ninja 00:01:48.260 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:48.260 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:01:48.518 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:01:48.518 ninja: no work to do. 00:01:55.090 The Meson build system 00:01:55.090 Version: 1.3.1 00:01:55.090 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk 00:01:55.090 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp 00:01:55.090 Build type: native build 00:01:55.090 Program cat found: YES (/usr/bin/cat) 00:01:55.090 Project name: DPDK 00:01:55.090 Project version: 23.11.0 00:01:55.090 C compiler for the host machine: clang-16 (clang 16.0.6 "clang version 16.0.6 (Fedora 16.0.6-3.fc38)") 00:01:55.090 C linker for the host machine: clang-16 ld.bfd 2.39-16 00:01:55.090 Host machine cpu family: x86_64 00:01:55.090 Host machine cpu: x86_64 00:01:55.090 Message: ## Building in Developer Mode ## 00:01:55.090 Program pkg-config found: YES (/usr/bin/pkg-config) 00:01:55.090 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:01:55.090 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:01:55.090 Program python3 found: YES (/usr/bin/python3) 00:01:55.090 Program cat found: YES (/usr/bin/cat) 00:01:55.090 Compiler for C supports arguments -march=native: YES 00:01:55.090 Checking for size of "void *" : 8 00:01:55.090 Checking for size of "void *" : 8 (cached) 00:01:55.090 Library m found: YES 00:01:55.090 Library numa found: YES 00:01:55.090 Has header "numaif.h" : YES 00:01:55.090 Library fdt found: NO 00:01:55.090 Library execinfo found: NO 00:01:55.090 Has header "execinfo.h" : YES 00:01:55.090 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:01:55.090 Run-time dependency libarchive found: NO (tried pkgconfig) 00:01:55.090 Run-time dependency libbsd found: NO (tried pkgconfig) 00:01:55.090 Run-time dependency jansson found: NO (tried pkgconfig) 00:01:55.090 Run-time dependency openssl found: YES 3.0.9 00:01:55.090 Run-time dependency libpcap found: YES 1.10.4 00:01:55.090 Has header "pcap.h" with dependency libpcap: YES 00:01:55.090 Compiler for C supports arguments -Wcast-qual: YES 00:01:55.090 Compiler for C supports arguments -Wdeprecated: YES 00:01:55.090 Compiler for C supports arguments -Wformat: YES 00:01:55.090 Compiler for C supports arguments -Wformat-nonliteral: YES 00:01:55.090 Compiler for C supports arguments -Wformat-security: YES 00:01:55.090 Compiler for C supports arguments -Wmissing-declarations: YES 00:01:55.090 Compiler for C supports arguments -Wmissing-prototypes: YES 00:01:55.090 Compiler for C supports arguments -Wnested-externs: YES 00:01:55.090 Compiler for C supports arguments -Wold-style-definition: YES 00:01:55.090 Compiler for C supports arguments -Wpointer-arith: YES 00:01:55.090 Compiler for C supports arguments -Wsign-compare: YES 00:01:55.090 Compiler for C supports arguments -Wstrict-prototypes: YES 00:01:55.090 Compiler for C supports arguments -Wundef: YES 00:01:55.090 Compiler for C supports arguments -Wwrite-strings: YES 00:01:55.090 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:01:55.090 Compiler for C supports arguments -Wno-packed-not-aligned: NO 00:01:55.090 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:01:55.090 Program objdump found: YES (/usr/bin/objdump) 00:01:55.090 
Compiler for C supports arguments -mavx512f: YES 00:01:55.090 Checking if "AVX512 checking" compiles: YES 00:01:55.090 Fetching value of define "__SSE4_2__" : 1 00:01:55.090 Fetching value of define "__AES__" : 1 00:01:55.090 Fetching value of define "__AVX__" : 1 00:01:55.090 Fetching value of define "__AVX2__" : 1 00:01:55.090 Fetching value of define "__AVX512BW__" : 1 00:01:55.090 Fetching value of define "__AVX512CD__" : 1 00:01:55.090 Fetching value of define "__AVX512DQ__" : 1 00:01:55.090 Fetching value of define "__AVX512F__" : 1 00:01:55.090 Fetching value of define "__AVX512VL__" : 1 00:01:55.090 Fetching value of define "__PCLMUL__" : 1 00:01:55.090 Fetching value of define "__RDRND__" : 1 00:01:55.090 Fetching value of define "__RDSEED__" : 1 00:01:55.090 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:01:55.090 Fetching value of define "__znver1__" : (undefined) 00:01:55.090 Fetching value of define "__znver2__" : (undefined) 00:01:55.090 Fetching value of define "__znver3__" : (undefined) 00:01:55.090 Fetching value of define "__znver4__" : (undefined) 00:01:55.090 Compiler for C supports arguments -Wno-format-truncation: NO 00:01:55.090 Message: lib/log: Defining dependency "log" 00:01:55.090 Message: lib/kvargs: Defining dependency "kvargs" 00:01:55.090 Message: lib/telemetry: Defining dependency "telemetry" 00:01:55.090 Checking for function "getentropy" : NO 00:01:55.090 Message: lib/eal: Defining dependency "eal" 00:01:55.090 Message: lib/ring: Defining dependency "ring" 00:01:55.090 Message: lib/rcu: Defining dependency "rcu" 00:01:55.090 Message: lib/mempool: Defining dependency "mempool" 00:01:55.090 Message: lib/mbuf: Defining dependency "mbuf" 00:01:55.090 Fetching value of define "__PCLMUL__" : 1 (cached) 00:01:55.090 Fetching value of define "__AVX512F__" : 1 (cached) 00:01:55.090 Fetching value of define "__AVX512BW__" : 1 (cached) 00:01:55.090 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:01:55.090 Fetching value of define "__AVX512VL__" : 1 (cached) 00:01:55.090 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:01:55.090 Compiler for C supports arguments -mpclmul: YES 00:01:55.090 Compiler for C supports arguments -maes: YES 00:01:55.090 Compiler for C supports arguments -mavx512f: YES (cached) 00:01:55.090 Compiler for C supports arguments -mavx512bw: YES 00:01:55.090 Compiler for C supports arguments -mavx512dq: YES 00:01:55.091 Compiler for C supports arguments -mavx512vl: YES 00:01:55.091 Compiler for C supports arguments -mvpclmulqdq: YES 00:01:55.091 Compiler for C supports arguments -mavx2: YES 00:01:55.091 Compiler for C supports arguments -mavx: YES 00:01:55.091 Message: lib/net: Defining dependency "net" 00:01:55.091 Message: lib/meter: Defining dependency "meter" 00:01:55.091 Message: lib/ethdev: Defining dependency "ethdev" 00:01:55.091 Message: lib/pci: Defining dependency "pci" 00:01:55.091 Message: lib/cmdline: Defining dependency "cmdline" 00:01:55.091 Message: lib/hash: Defining dependency "hash" 00:01:55.091 Message: lib/timer: Defining dependency "timer" 00:01:55.091 Message: lib/compressdev: Defining dependency "compressdev" 00:01:55.091 Message: lib/cryptodev: Defining dependency "cryptodev" 00:01:55.091 Message: lib/dmadev: Defining dependency "dmadev" 00:01:55.091 Compiler for C supports arguments -Wno-cast-qual: YES 00:01:55.091 Message: lib/power: Defining dependency "power" 00:01:55.091 Message: lib/reorder: Defining dependency "reorder" 00:01:55.091 Message: lib/security: Defining dependency 
"security" 00:01:55.091 Has header "linux/userfaultfd.h" : YES 00:01:55.091 Has header "linux/vduse.h" : YES 00:01:55.091 Message: lib/vhost: Defining dependency "vhost" 00:01:55.091 Compiler for C supports arguments -Wno-format-truncation: NO (cached) 00:01:55.091 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:01:55.091 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:01:55.091 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:55.091 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:55.091 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:55.091 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:55.091 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:55.091 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:55.091 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:55.091 Program doxygen found: YES (/usr/bin/doxygen) 00:01:55.091 Configuring doxy-api-html.conf using configuration 00:01:55.091 Configuring doxy-api-man.conf using configuration 00:01:55.091 Program mandb found: YES (/usr/bin/mandb) 00:01:55.091 Program sphinx-build found: NO 00:01:55.091 Configuring rte_build_config.h using configuration 00:01:55.091 Message: 00:01:55.091 ================= 00:01:55.091 Applications Enabled 00:01:55.091 ================= 00:01:55.091 00:01:55.091 apps: 00:01:55.091 00:01:55.091 00:01:55.091 Message: 00:01:55.091 ================= 00:01:55.091 Libraries Enabled 00:01:55.091 ================= 00:01:55.091 00:01:55.091 libs: 00:01:55.091 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:55.091 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:55.091 cryptodev, dmadev, power, reorder, security, vhost, 00:01:55.091 00:01:55.091 Message: 00:01:55.091 =============== 00:01:55.091 Drivers Enabled 00:01:55.091 =============== 00:01:55.091 00:01:55.091 common: 00:01:55.091 00:01:55.091 bus: 00:01:55.091 pci, vdev, 00:01:55.091 mempool: 00:01:55.091 ring, 00:01:55.091 dma: 00:01:55.091 00:01:55.091 net: 00:01:55.091 00:01:55.091 crypto: 00:01:55.091 00:01:55.091 compress: 00:01:55.091 00:01:55.091 vdpa: 00:01:55.091 00:01:55.091 00:01:55.091 Message: 00:01:55.091 ================= 00:01:55.091 Content Skipped 00:01:55.091 ================= 00:01:55.091 00:01:55.091 apps: 00:01:55.091 dumpcap: explicitly disabled via build config 00:01:55.091 graph: explicitly disabled via build config 00:01:55.091 pdump: explicitly disabled via build config 00:01:55.091 proc-info: explicitly disabled via build config 00:01:55.091 test-acl: explicitly disabled via build config 00:01:55.091 test-bbdev: explicitly disabled via build config 00:01:55.091 test-cmdline: explicitly disabled via build config 00:01:55.091 test-compress-perf: explicitly disabled via build config 00:01:55.091 test-crypto-perf: explicitly disabled via build config 00:01:55.091 test-dma-perf: explicitly disabled via build config 00:01:55.091 test-eventdev: explicitly disabled via build config 00:01:55.091 test-fib: explicitly disabled via build config 00:01:55.091 test-flow-perf: explicitly disabled via build config 00:01:55.091 test-gpudev: explicitly disabled via build config 00:01:55.091 test-mldev: explicitly disabled via build config 00:01:55.091 test-pipeline: explicitly disabled via build config 00:01:55.091 test-pmd: explicitly disabled via build config 00:01:55.091 test-regex: explicitly disabled via 
build config 00:01:55.091 test-sad: explicitly disabled via build config 00:01:55.091 test-security-perf: explicitly disabled via build config 00:01:55.091 00:01:55.091 libs: 00:01:55.091 metrics: explicitly disabled via build config 00:01:55.091 acl: explicitly disabled via build config 00:01:55.091 bbdev: explicitly disabled via build config 00:01:55.091 bitratestats: explicitly disabled via build config 00:01:55.091 bpf: explicitly disabled via build config 00:01:55.091 cfgfile: explicitly disabled via build config 00:01:55.091 distributor: explicitly disabled via build config 00:01:55.091 efd: explicitly disabled via build config 00:01:55.091 eventdev: explicitly disabled via build config 00:01:55.091 dispatcher: explicitly disabled via build config 00:01:55.091 gpudev: explicitly disabled via build config 00:01:55.091 gro: explicitly disabled via build config 00:01:55.091 gso: explicitly disabled via build config 00:01:55.091 ip_frag: explicitly disabled via build config 00:01:55.091 jobstats: explicitly disabled via build config 00:01:55.091 latencystats: explicitly disabled via build config 00:01:55.091 lpm: explicitly disabled via build config 00:01:55.091 member: explicitly disabled via build config 00:01:55.091 pcapng: explicitly disabled via build config 00:01:55.091 rawdev: explicitly disabled via build config 00:01:55.091 regexdev: explicitly disabled via build config 00:01:55.091 mldev: explicitly disabled via build config 00:01:55.091 rib: explicitly disabled via build config 00:01:55.091 sched: explicitly disabled via build config 00:01:55.091 stack: explicitly disabled via build config 00:01:55.091 ipsec: explicitly disabled via build config 00:01:55.091 pdcp: explicitly disabled via build config 00:01:55.091 fib: explicitly disabled via build config 00:01:55.091 port: explicitly disabled via build config 00:01:55.091 pdump: explicitly disabled via build config 00:01:55.091 table: explicitly disabled via build config 00:01:55.091 pipeline: explicitly disabled via build config 00:01:55.091 graph: explicitly disabled via build config 00:01:55.091 node: explicitly disabled via build config 00:01:55.091 00:01:55.091 drivers: 00:01:55.091 common/cpt: not in enabled drivers build config 00:01:55.091 common/dpaax: not in enabled drivers build config 00:01:55.091 common/iavf: not in enabled drivers build config 00:01:55.091 common/idpf: not in enabled drivers build config 00:01:55.091 common/mvep: not in enabled drivers build config 00:01:55.091 common/octeontx: not in enabled drivers build config 00:01:55.091 bus/auxiliary: not in enabled drivers build config 00:01:55.091 bus/cdx: not in enabled drivers build config 00:01:55.091 bus/dpaa: not in enabled drivers build config 00:01:55.091 bus/fslmc: not in enabled drivers build config 00:01:55.091 bus/ifpga: not in enabled drivers build config 00:01:55.091 bus/platform: not in enabled drivers build config 00:01:55.091 bus/vmbus: not in enabled drivers build config 00:01:55.091 common/cnxk: not in enabled drivers build config 00:01:55.091 common/mlx5: not in enabled drivers build config 00:01:55.091 common/nfp: not in enabled drivers build config 00:01:55.091 common/qat: not in enabled drivers build config 00:01:55.091 common/sfc_efx: not in enabled drivers build config 00:01:55.091 mempool/bucket: not in enabled drivers build config 00:01:55.091 mempool/cnxk: not in enabled drivers build config 00:01:55.091 mempool/dpaa: not in enabled drivers build config 00:01:55.091 mempool/dpaa2: not in enabled drivers build config 00:01:55.091 
mempool/octeontx: not in enabled drivers build config 00:01:55.091 mempool/stack: not in enabled drivers build config 00:01:55.091 dma/cnxk: not in enabled drivers build config 00:01:55.091 dma/dpaa: not in enabled drivers build config 00:01:55.091 dma/dpaa2: not in enabled drivers build config 00:01:55.091 dma/hisilicon: not in enabled drivers build config 00:01:55.091 dma/idxd: not in enabled drivers build config 00:01:55.091 dma/ioat: not in enabled drivers build config 00:01:55.091 dma/skeleton: not in enabled drivers build config 00:01:55.091 net/af_packet: not in enabled drivers build config 00:01:55.091 net/af_xdp: not in enabled drivers build config 00:01:55.091 net/ark: not in enabled drivers build config 00:01:55.091 net/atlantic: not in enabled drivers build config 00:01:55.091 net/avp: not in enabled drivers build config 00:01:55.091 net/axgbe: not in enabled drivers build config 00:01:55.091 net/bnx2x: not in enabled drivers build config 00:01:55.091 net/bnxt: not in enabled drivers build config 00:01:55.091 net/bonding: not in enabled drivers build config 00:01:55.091 net/cnxk: not in enabled drivers build config 00:01:55.091 net/cpfl: not in enabled drivers build config 00:01:55.091 net/cxgbe: not in enabled drivers build config 00:01:55.091 net/dpaa: not in enabled drivers build config 00:01:55.091 net/dpaa2: not in enabled drivers build config 00:01:55.091 net/e1000: not in enabled drivers build config 00:01:55.091 net/ena: not in enabled drivers build config 00:01:55.091 net/enetc: not in enabled drivers build config 00:01:55.091 net/enetfec: not in enabled drivers build config 00:01:55.091 net/enic: not in enabled drivers build config 00:01:55.092 net/failsafe: not in enabled drivers build config 00:01:55.092 net/fm10k: not in enabled drivers build config 00:01:55.092 net/gve: not in enabled drivers build config 00:01:55.092 net/hinic: not in enabled drivers build config 00:01:55.092 net/hns3: not in enabled drivers build config 00:01:55.092 net/i40e: not in enabled drivers build config 00:01:55.092 net/iavf: not in enabled drivers build config 00:01:55.092 net/ice: not in enabled drivers build config 00:01:55.092 net/idpf: not in enabled drivers build config 00:01:55.092 net/igc: not in enabled drivers build config 00:01:55.092 net/ionic: not in enabled drivers build config 00:01:55.092 net/ipn3ke: not in enabled drivers build config 00:01:55.092 net/ixgbe: not in enabled drivers build config 00:01:55.092 net/mana: not in enabled drivers build config 00:01:55.092 net/memif: not in enabled drivers build config 00:01:55.092 net/mlx4: not in enabled drivers build config 00:01:55.092 net/mlx5: not in enabled drivers build config 00:01:55.092 net/mvneta: not in enabled drivers build config 00:01:55.092 net/mvpp2: not in enabled drivers build config 00:01:55.092 net/netvsc: not in enabled drivers build config 00:01:55.092 net/nfb: not in enabled drivers build config 00:01:55.092 net/nfp: not in enabled drivers build config 00:01:55.092 net/ngbe: not in enabled drivers build config 00:01:55.092 net/null: not in enabled drivers build config 00:01:55.092 net/octeontx: not in enabled drivers build config 00:01:55.092 net/octeon_ep: not in enabled drivers build config 00:01:55.092 net/pcap: not in enabled drivers build config 00:01:55.092 net/pfe: not in enabled drivers build config 00:01:55.092 net/qede: not in enabled drivers build config 00:01:55.092 net/ring: not in enabled drivers build config 00:01:55.092 net/sfc: not in enabled drivers build config 00:01:55.092 net/softnic: 
not in enabled drivers build config 00:01:55.092 net/tap: not in enabled drivers build config 00:01:55.092 net/thunderx: not in enabled drivers build config 00:01:55.092 net/txgbe: not in enabled drivers build config 00:01:55.092 net/vdev_netvsc: not in enabled drivers build config 00:01:55.092 net/vhost: not in enabled drivers build config 00:01:55.092 net/virtio: not in enabled drivers build config 00:01:55.092 net/vmxnet3: not in enabled drivers build config 00:01:55.092 raw/*: missing internal dependency, "rawdev" 00:01:55.092 crypto/armv8: not in enabled drivers build config 00:01:55.092 crypto/bcmfs: not in enabled drivers build config 00:01:55.092 crypto/caam_jr: not in enabled drivers build config 00:01:55.092 crypto/ccp: not in enabled drivers build config 00:01:55.092 crypto/cnxk: not in enabled drivers build config 00:01:55.092 crypto/dpaa_sec: not in enabled drivers build config 00:01:55.092 crypto/dpaa2_sec: not in enabled drivers build config 00:01:55.092 crypto/ipsec_mb: not in enabled drivers build config 00:01:55.092 crypto/mlx5: not in enabled drivers build config 00:01:55.092 crypto/mvsam: not in enabled drivers build config 00:01:55.092 crypto/nitrox: not in enabled drivers build config 00:01:55.092 crypto/null: not in enabled drivers build config 00:01:55.092 crypto/octeontx: not in enabled drivers build config 00:01:55.092 crypto/openssl: not in enabled drivers build config 00:01:55.092 crypto/scheduler: not in enabled drivers build config 00:01:55.092 crypto/uadk: not in enabled drivers build config 00:01:55.092 crypto/virtio: not in enabled drivers build config 00:01:55.092 compress/isal: not in enabled drivers build config 00:01:55.092 compress/mlx5: not in enabled drivers build config 00:01:55.092 compress/octeontx: not in enabled drivers build config 00:01:55.092 compress/zlib: not in enabled drivers build config 00:01:55.092 regex/*: missing internal dependency, "regexdev" 00:01:55.092 ml/*: missing internal dependency, "mldev" 00:01:55.092 vdpa/ifc: not in enabled drivers build config 00:01:55.092 vdpa/mlx5: not in enabled drivers build config 00:01:55.092 vdpa/nfp: not in enabled drivers build config 00:01:55.092 vdpa/sfc: not in enabled drivers build config 00:01:55.092 event/*: missing internal dependency, "eventdev" 00:01:55.092 baseband/*: missing internal dependency, "bbdev" 00:01:55.092 gpu/*: missing internal dependency, "gpudev" 00:01:55.092 00:01:55.092 00:01:55.092 Build targets in project: 85 00:01:55.092 00:01:55.092 DPDK 23.11.0 00:01:55.092 00:01:55.092 User defined options 00:01:55.092 buildtype : debug 00:01:55.092 default_library : static 00:01:55.092 libdir : lib 00:01:55.092 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:55.092 c_args : -fPIC -Werror 00:01:55.092 c_link_args : 00:01:55.092 cpu_instruction_set: native 00:01:55.092 disable_apps : test-acl,test-bbdev,test-crypto-perf,test-fib,test-pipeline,test-gpudev,test-flow-perf,pdump,dumpcap,test-sad,test-cmdline,test-eventdev,proc-info,test,test-dma-perf,test-pmd,test-mldev,test-compress-perf,test-security-perf,graph,test-regex 00:01:55.092 disable_libs : pipeline,member,eventdev,efd,bbdev,cfgfile,rib,sched,mldev,metrics,lpm,latencystats,pdump,pdcp,bpf,ipsec,fib,ip_frag,table,port,stack,gro,jobstats,regexdev,rawdev,pcapng,dispatcher,node,bitratestats,acl,gpudev,distributor,graph,gso 00:01:55.092 enable_docs : false 00:01:55.092 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:01:55.092 enable_kmods : false 00:01:55.092 tests : false 00:01:55.092 
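Aside: a rough manual equivalent of the DPDK configuration summarized in the "User defined options" block above (in this job it is driven through SPDK's configure rather than invoked directly); option names and values are copied from that block, and the long disable_apps/disable_libs lists are abbreviated:

cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
# set disable_apps / disable_libs to the comma-separated lists printed above
meson setup build-tmp \
  --buildtype=debug --default-library=static --libdir=lib --prefix="$PWD/build" \
  -Dc_args='-fPIC -Werror' -Dcpu_instruction_set=native \
  -Ddisable_apps="$disable_apps" -Ddisable_libs="$disable_libs" \
  -Denable_docs=false -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring \
  -Denable_kmods=false -Dtests=false
ninja -C build-tmp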
00:01:55.092 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:55.092 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp' 00:01:55.092 [1/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:55.092 [2/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:55.092 [3/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:55.092 [4/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:55.092 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:55.092 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:55.092 [7/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:55.092 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:55.092 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:55.092 [10/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:55.092 [11/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:55.092 [12/265] Linking static target lib/librte_kvargs.a 00:01:55.092 [13/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:55.092 [14/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:55.092 [15/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:55.092 [16/265] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:55.092 [17/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:55.092 [18/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:55.092 [19/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:55.092 [20/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:55.092 [21/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:55.092 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:55.092 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:55.092 [24/265] Linking static target lib/librte_log.a 00:01:55.092 [25/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:55.352 [26/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.352 [27/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:55.352 [28/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:55.352 [29/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:55.352 [30/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:55.352 [31/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:55.352 [32/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:55.352 [33/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:55.352 [34/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:55.352 [35/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:55.352 [36/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:55.352 [37/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:55.352 [38/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:55.352 [39/265] 
Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:55.352 [40/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:55.352 [41/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:55.352 [42/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:55.352 [43/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:55.352 [44/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:55.352 [45/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:55.352 [46/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:55.352 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:55.352 [48/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:55.352 [49/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:55.352 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:55.352 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:55.352 [52/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:55.352 [53/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:55.352 [54/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:55.352 [55/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:55.352 [56/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:55.352 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:55.352 [58/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:55.352 [59/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:55.352 [60/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:55.352 [61/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:55.352 [62/265] Linking static target lib/librte_telemetry.a 00:01:55.352 [63/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:55.612 [64/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:55.612 [65/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:55.612 [66/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:55.612 [67/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:55.612 [68/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:55.612 [69/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:55.612 [70/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:55.612 [71/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:55.612 [72/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:55.612 [73/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:55.612 [74/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:55.612 [75/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:55.612 [76/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:55.612 [77/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:55.612 [78/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:55.612 [79/265] Compiling C 
object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:55.612 [80/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:55.612 [81/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:55.612 [82/265] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:55.612 [83/265] Linking static target lib/librte_ring.a 00:01:55.612 [84/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:55.612 [85/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:55.612 [86/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:55.612 [87/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:55.612 [88/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:55.612 [89/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:55.612 [90/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:55.612 [91/265] Linking static target lib/librte_pci.a 00:01:55.612 [92/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:55.612 [93/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:55.612 [94/265] Linking static target lib/librte_meter.a 00:01:55.612 [95/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:55.612 [96/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:55.612 [97/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:55.612 [98/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:55.612 [99/265] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:55.612 [100/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:55.612 [101/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:55.612 [102/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:55.612 [103/265] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:55.612 [104/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:55.612 [105/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:55.612 [106/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:55.612 [107/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:55.612 [108/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:55.612 [109/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:55.612 [110/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:55.612 [111/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:55.612 [112/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:55.612 [113/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:01:55.612 [114/265] Linking static target lib/librte_mempool.a 00:01:55.612 [115/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:55.612 [116/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.612 [117/265] Linking static target lib/librte_rcu.a 00:01:55.612 [118/265] Linking static target lib/librte_net.a 00:01:55.612 [119/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:55.612 [120/265] Linking static target lib/librte_eal.a 00:01:55.612 [121/265] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:55.612 [122/265] Linking target lib/librte_log.so.24.0 00:01:55.878 [123/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:55.878 [124/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.878 [125/265] Linking static target lib/librte_mbuf.a 00:01:55.879 [126/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.879 [127/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.879 [128/265] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:01:55.879 [129/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.879 [130/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.879 [131/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:55.879 [132/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:55.879 [133/265] Linking target lib/librte_kvargs.so.24.0 00:01:55.879 [134/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:56.138 [135/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:56.138 [136/265] Linking target lib/librte_telemetry.so.24.0 00:01:56.138 [137/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:56.138 [138/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:56.138 [139/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:56.138 [140/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:56.138 [141/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:56.138 [142/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:56.138 [143/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:56.138 [144/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:56.138 [145/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:56.138 [146/265] Linking static target lib/librte_timer.a 00:01:56.138 [147/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:56.138 [148/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:56.138 [149/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:56.138 [150/265] Linking static target lib/librte_cmdline.a 00:01:56.138 [151/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:56.138 [152/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:56.138 [153/265] Linking static target lib/librte_dmadev.a 00:01:56.138 [154/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:56.138 [155/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:56.138 [156/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:56.138 [157/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:56.138 [158/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:56.138 [159/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:56.138 [160/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:01:56.138 
[161/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:56.138 [162/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:56.138 [163/265] Linking static target lib/librte_power.a 00:01:56.138 [164/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:56.138 [165/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:56.138 [166/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:56.138 [167/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:01:56.138 [168/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:56.138 [169/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:56.138 [170/265] Linking static target lib/librte_compressdev.a 00:01:56.138 [171/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:56.138 [172/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:56.138 [173/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:56.138 [174/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:56.138 [175/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:56.138 [176/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:56.138 [177/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:56.138 [178/265] Linking static target lib/librte_reorder.a 00:01:56.138 [179/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:56.138 [180/265] Linking static target lib/librte_hash.a 00:01:56.138 [181/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:56.138 [182/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:56.138 [183/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:56.138 [184/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:56.138 [185/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:56.138 [186/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:56.138 [187/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:56.138 [188/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:56.138 [189/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:56.138 [190/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:56.396 [191/265] Linking static target lib/librte_security.a 00:01:56.396 [192/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:56.396 [193/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:56.396 [194/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.396 [195/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:56.396 [196/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:56.396 [197/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:56.396 [198/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:56.396 [199/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:56.396 [200/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:56.396 [201/265] Generating 
drivers/rte_mempool_ring.pmd.c with a custom command 00:01:56.396 [202/265] Linking static target drivers/librte_bus_vdev.a 00:01:56.396 [203/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:56.396 [204/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:56.396 [205/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:56.396 [206/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:56.396 [207/265] Linking static target drivers/librte_bus_pci.a 00:01:56.396 [208/265] Linking static target lib/librte_cryptodev.a 00:01:56.396 [209/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.396 [210/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:56.396 [211/265] Linking static target drivers/librte_mempool_ring.a 00:01:56.396 [212/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.654 [213/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.654 [214/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:56.654 [215/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.654 [216/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:56.654 [217/265] Linking static target lib/librte_ethdev.a 00:01:56.912 [218/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.912 [219/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:56.912 [220/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.169 [221/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.169 [222/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.169 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:57.169 [224/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:57.428 [225/265] Linking static target lib/librte_vhost.a 00:01:57.428 [226/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:58.802 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:59.368 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:05.984 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.517 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:08.517 [231/265] Linking target lib/librte_eal.so.24.0 00:02:08.517 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:08.517 [233/265] Linking target lib/librte_ring.so.24.0 00:02:08.517 [234/265] Linking target lib/librte_timer.so.24.0 00:02:08.517 [235/265] Linking target lib/librte_pci.so.24.0 00:02:08.517 [236/265] Linking target drivers/librte_bus_vdev.so.24.0 00:02:08.517 [237/265] Linking target lib/librte_meter.so.24.0 00:02:08.517 [238/265] Linking target lib/librte_dmadev.so.24.0 00:02:08.517 [239/265] 
Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:08.517 [240/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:08.517 [241/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:08.517 [242/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:08.517 [243/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:08.517 [244/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:08.517 [245/265] Linking target lib/librte_rcu.so.24.0 00:02:08.517 [246/265] Linking target lib/librte_mempool.so.24.0 00:02:08.776 [247/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:08.776 [248/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:08.776 [249/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:08.776 [250/265] Linking target lib/librte_mbuf.so.24.0 00:02:09.035 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:09.035 [252/265] Linking target lib/librte_net.so.24.0 00:02:09.035 [253/265] Linking target lib/librte_cryptodev.so.24.0 00:02:09.035 [254/265] Linking target lib/librte_compressdev.so.24.0 00:02:09.035 [255/265] Linking target lib/librte_reorder.so.24.0 00:02:09.035 [256/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:09.293 [257/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:09.293 [258/265] Linking target lib/librte_security.so.24.0 00:02:09.293 [259/265] Linking target lib/librte_hash.so.24.0 00:02:09.293 [260/265] Linking target lib/librte_cmdline.so.24.0 00:02:09.293 [261/265] Linking target lib/librte_ethdev.so.24.0 00:02:09.293 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:09.293 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:09.551 [264/265] Linking target lib/librte_power.so.24.0 00:02:09.551 [265/265] Linking target lib/librte_vhost.so.24.0 00:02:09.551 INFO: autodetecting backend as ninja 00:02:09.551 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 72 00:02:10.486 CC lib/ut_mock/mock.o 00:02:10.486 CC lib/log/log.o 00:02:10.486 CC lib/log/log_flags.o 00:02:10.486 CC lib/log/log_deprecated.o 00:02:10.486 CC lib/ut/ut.o 00:02:10.486 LIB libspdk_ut_mock.a 00:02:10.486 LIB libspdk_log.a 00:02:10.486 LIB libspdk_ut.a 00:02:10.745 CC lib/util/base64.o 00:02:10.745 CC lib/util/cpuset.o 00:02:10.745 CC lib/util/bit_array.o 00:02:10.745 CC lib/util/crc16.o 00:02:10.745 CC lib/util/crc32.o 00:02:10.745 CC lib/util/crc32c.o 00:02:10.745 CC lib/util/crc32_ieee.o 00:02:10.745 CC lib/util/crc64.o 00:02:10.745 CC lib/util/dif.o 00:02:10.745 CC lib/util/fd.o 00:02:10.745 CC lib/util/file.o 00:02:10.745 CC lib/util/hexlify.o 00:02:10.745 CC lib/util/iov.o 00:02:10.745 CC lib/util/pipe.o 00:02:10.745 CC lib/util/math.o 00:02:10.745 CC lib/ioat/ioat.o 00:02:10.745 CC lib/util/string.o 00:02:10.745 CC lib/util/strerror_tls.o 00:02:10.745 CC lib/util/uuid.o 00:02:10.745 CC lib/util/fd_group.o 00:02:10.745 CC lib/util/xor.o 00:02:10.745 CC lib/util/zipf.o 00:02:10.745 CC lib/dma/dma.o 00:02:10.745 CXX lib/trace_parser/trace.o 00:02:11.003 CC lib/vfio_user/host/vfio_user_pci.o 00:02:11.003 CC lib/vfio_user/host/vfio_user.o 
00:02:11.003 LIB libspdk_dma.a 00:02:11.003 LIB libspdk_ioat.a 00:02:11.003 LIB libspdk_vfio_user.a 00:02:11.260 LIB libspdk_util.a 00:02:11.260 LIB libspdk_trace_parser.a 00:02:11.518 CC lib/vmd/vmd.o 00:02:11.518 CC lib/vmd/led.o 00:02:11.518 CC lib/json/json_parse.o 00:02:11.518 CC lib/json/json_util.o 00:02:11.518 CC lib/json/json_write.o 00:02:11.518 CC lib/idxd/idxd.o 00:02:11.518 CC lib/idxd/idxd_user.o 00:02:11.518 CC lib/conf/conf.o 00:02:11.518 CC lib/env_dpdk/env.o 00:02:11.518 CC lib/env_dpdk/memory.o 00:02:11.518 CC lib/env_dpdk/pci.o 00:02:11.518 CC lib/env_dpdk/init.o 00:02:11.518 CC lib/rdma/common.o 00:02:11.518 CC lib/env_dpdk/threads.o 00:02:11.518 CC lib/env_dpdk/pci_vmd.o 00:02:11.518 CC lib/rdma/rdma_verbs.o 00:02:11.518 CC lib/env_dpdk/pci_ioat.o 00:02:11.518 CC lib/env_dpdk/pci_virtio.o 00:02:11.518 CC lib/env_dpdk/pci_event.o 00:02:11.518 CC lib/env_dpdk/pci_idxd.o 00:02:11.518 CC lib/env_dpdk/pci_dpdk.o 00:02:11.518 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:11.518 CC lib/env_dpdk/sigbus_handler.o 00:02:11.518 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:11.777 LIB libspdk_conf.a 00:02:11.777 LIB libspdk_json.a 00:02:11.777 LIB libspdk_rdma.a 00:02:11.777 LIB libspdk_idxd.a 00:02:11.777 LIB libspdk_vmd.a 00:02:12.035 CC lib/jsonrpc/jsonrpc_server.o 00:02:12.035 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:12.035 CC lib/jsonrpc/jsonrpc_client.o 00:02:12.035 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:12.035 LIB libspdk_jsonrpc.a 00:02:12.601 CC lib/rpc/rpc.o 00:02:12.601 LIB libspdk_env_dpdk.a 00:02:12.601 LIB libspdk_rpc.a 00:02:12.859 CC lib/keyring/keyring.o 00:02:12.859 CC lib/keyring/keyring_rpc.o 00:02:12.859 CC lib/trace/trace.o 00:02:12.859 CC lib/trace/trace_rpc.o 00:02:12.859 CC lib/trace/trace_flags.o 00:02:12.859 CC lib/notify/notify.o 00:02:12.859 CC lib/notify/notify_rpc.o 00:02:13.118 LIB libspdk_notify.a 00:02:13.118 LIB libspdk_keyring.a 00:02:13.118 LIB libspdk_trace.a 00:02:13.377 CC lib/sock/sock.o 00:02:13.377 CC lib/sock/sock_rpc.o 00:02:13.377 CC lib/thread/iobuf.o 00:02:13.377 CC lib/thread/thread.o 00:02:13.635 LIB libspdk_sock.a 00:02:13.893 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:13.893 CC lib/nvme/nvme_fabric.o 00:02:13.893 CC lib/nvme/nvme_ctrlr.o 00:02:13.893 CC lib/nvme/nvme_ns.o 00:02:13.893 CC lib/nvme/nvme_ns_cmd.o 00:02:13.893 CC lib/nvme/nvme_pcie_common.o 00:02:13.893 CC lib/nvme/nvme_qpair.o 00:02:13.893 CC lib/nvme/nvme_pcie.o 00:02:13.893 CC lib/nvme/nvme_quirks.o 00:02:13.893 CC lib/nvme/nvme_transport.o 00:02:13.893 CC lib/nvme/nvme.o 00:02:13.893 CC lib/nvme/nvme_discovery.o 00:02:13.893 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:13.893 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:13.893 CC lib/nvme/nvme_opal.o 00:02:13.893 CC lib/nvme/nvme_tcp.o 00:02:13.893 CC lib/nvme/nvme_poll_group.o 00:02:13.893 CC lib/nvme/nvme_io_msg.o 00:02:13.893 CC lib/nvme/nvme_stubs.o 00:02:13.893 CC lib/nvme/nvme_zns.o 00:02:13.893 CC lib/nvme/nvme_cuse.o 00:02:13.893 CC lib/nvme/nvme_auth.o 00:02:13.893 CC lib/nvme/nvme_vfio_user.o 00:02:13.893 CC lib/nvme/nvme_rdma.o 00:02:14.151 LIB libspdk_thread.a 00:02:14.410 CC lib/init/subsystem_rpc.o 00:02:14.410 CC lib/init/subsystem.o 00:02:14.410 CC lib/init/json_config.o 00:02:14.410 CC lib/init/rpc.o 00:02:14.410 CC lib/blob/blobstore.o 00:02:14.410 CC lib/virtio/virtio.o 00:02:14.410 CC lib/blob/zeroes.o 00:02:14.410 CC lib/blob/request.o 00:02:14.410 CC lib/virtio/virtio_vhost_user.o 00:02:14.410 CC lib/virtio/virtio_pci.o 00:02:14.410 CC lib/virtio/virtio_vfio_user.o 00:02:14.410 CC lib/blob/blob_bs_dev.o 00:02:14.410 CC 
lib/accel/accel.o 00:02:14.410 CC lib/accel/accel_sw.o 00:02:14.410 CC lib/accel/accel_rpc.o 00:02:14.410 CC lib/vfu_tgt/tgt_endpoint.o 00:02:14.410 CC lib/vfu_tgt/tgt_rpc.o 00:02:14.668 LIB libspdk_init.a 00:02:14.668 LIB libspdk_virtio.a 00:02:14.668 LIB libspdk_vfu_tgt.a 00:02:14.926 CC lib/event/app.o 00:02:14.926 CC lib/event/app_rpc.o 00:02:14.926 CC lib/event/reactor.o 00:02:14.926 CC lib/event/log_rpc.o 00:02:14.926 CC lib/event/scheduler_static.o 00:02:15.184 LIB libspdk_event.a 00:02:15.184 LIB libspdk_accel.a 00:02:15.442 LIB libspdk_nvme.a 00:02:15.442 CC lib/bdev/bdev.o 00:02:15.442 CC lib/bdev/bdev_zone.o 00:02:15.442 CC lib/bdev/bdev_rpc.o 00:02:15.442 CC lib/bdev/part.o 00:02:15.442 CC lib/bdev/scsi_nvme.o 00:02:16.060 LIB libspdk_blob.a 00:02:16.401 CC lib/lvol/lvol.o 00:02:16.401 CC lib/blobfs/blobfs.o 00:02:16.401 CC lib/blobfs/tree.o 00:02:16.970 LIB libspdk_lvol.a 00:02:16.970 LIB libspdk_blobfs.a 00:02:17.229 LIB libspdk_bdev.a 00:02:17.488 CC lib/nbd/nbd.o 00:02:17.488 CC lib/nbd/nbd_rpc.o 00:02:17.488 CC lib/ublk/ublk.o 00:02:17.488 CC lib/ftl/ftl_core.o 00:02:17.488 CC lib/ublk/ublk_rpc.o 00:02:17.488 CC lib/nvmf/ctrlr.o 00:02:17.488 CC lib/nvmf/ctrlr_discovery.o 00:02:17.488 CC lib/nvmf/subsystem.o 00:02:17.488 CC lib/ftl/ftl_init.o 00:02:17.488 CC lib/nvmf/ctrlr_bdev.o 00:02:17.488 CC lib/ftl/ftl_io.o 00:02:17.488 CC lib/ftl/ftl_layout.o 00:02:17.488 CC lib/nvmf/nvmf_rpc.o 00:02:17.488 CC lib/nvmf/nvmf.o 00:02:17.488 CC lib/ftl/ftl_debug.o 00:02:17.488 CC lib/nvmf/tcp.o 00:02:17.488 CC lib/scsi/dev.o 00:02:17.488 CC lib/ftl/ftl_sb.o 00:02:17.488 CC lib/nvmf/transport.o 00:02:17.488 CC lib/scsi/lun.o 00:02:17.488 CC lib/ftl/ftl_l2p.o 00:02:17.488 CC lib/scsi/port.o 00:02:17.488 CC lib/ftl/ftl_l2p_flat.o 00:02:17.488 CC lib/nvmf/vfio_user.o 00:02:17.746 CC lib/scsi/scsi.o 00:02:17.746 CC lib/nvmf/rdma.o 00:02:17.746 CC lib/ftl/ftl_nv_cache.o 00:02:17.746 CC lib/scsi/scsi_bdev.o 00:02:17.746 CC lib/ftl/ftl_band.o 00:02:17.746 CC lib/scsi/scsi_pr.o 00:02:17.746 CC lib/ftl/ftl_band_ops.o 00:02:17.746 CC lib/scsi/scsi_rpc.o 00:02:17.746 CC lib/ftl/ftl_writer.o 00:02:17.746 CC lib/scsi/task.o 00:02:17.747 CC lib/ftl/ftl_rq.o 00:02:17.747 CC lib/ftl/ftl_reloc.o 00:02:17.747 CC lib/ftl/ftl_l2p_cache.o 00:02:17.747 CC lib/ftl/ftl_p2l.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:17.747 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:17.747 CC lib/ftl/utils/ftl_conf.o 00:02:17.747 CC lib/ftl/utils/ftl_md.o 00:02:17.747 CC lib/ftl/utils/ftl_mempool.o 00:02:17.747 CC lib/ftl/utils/ftl_bitmap.o 00:02:17.747 CC lib/ftl/utils/ftl_property.o 00:02:17.747 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:17.747 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:17.747 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:17.747 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:17.747 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:17.747 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:17.747 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:17.747 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:17.747 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:17.747 
CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:17.747 CC lib/ftl/base/ftl_base_dev.o 00:02:17.747 CC lib/ftl/base/ftl_base_bdev.o 00:02:17.747 CC lib/ftl/ftl_trace.o 00:02:18.005 LIB libspdk_nbd.a 00:02:18.005 LIB libspdk_scsi.a 00:02:18.263 LIB libspdk_ublk.a 00:02:18.263 LIB libspdk_ftl.a 00:02:18.263 CC lib/iscsi/conn.o 00:02:18.263 CC lib/iscsi/init_grp.o 00:02:18.263 CC lib/iscsi/md5.o 00:02:18.263 CC lib/iscsi/iscsi.o 00:02:18.263 CC lib/iscsi/param.o 00:02:18.263 CC lib/iscsi/portal_grp.o 00:02:18.263 CC lib/iscsi/tgt_node.o 00:02:18.263 CC lib/iscsi/iscsi_subsystem.o 00:02:18.263 CC lib/iscsi/iscsi_rpc.o 00:02:18.263 CC lib/iscsi/task.o 00:02:18.522 CC lib/vhost/vhost.o 00:02:18.522 CC lib/vhost/vhost_rpc.o 00:02:18.522 CC lib/vhost/rte_vhost_user.o 00:02:18.522 CC lib/vhost/vhost_scsi.o 00:02:18.522 CC lib/vhost/vhost_blk.o 00:02:19.088 LIB libspdk_nvmf.a 00:02:19.088 LIB libspdk_vhost.a 00:02:19.088 LIB libspdk_iscsi.a 00:02:19.655 CC module/env_dpdk/env_dpdk_rpc.o 00:02:19.655 CC module/vfu_device/vfu_virtio.o 00:02:19.655 CC module/vfu_device/vfu_virtio_scsi.o 00:02:19.655 CC module/vfu_device/vfu_virtio_rpc.o 00:02:19.655 CC module/vfu_device/vfu_virtio_blk.o 00:02:19.655 CC module/accel/error/accel_error.o 00:02:19.655 CC module/accel/error/accel_error_rpc.o 00:02:19.655 CC module/accel/ioat/accel_ioat.o 00:02:19.655 CC module/accel/ioat/accel_ioat_rpc.o 00:02:19.655 CC module/sock/posix/posix.o 00:02:19.655 CC module/scheduler/gscheduler/gscheduler.o 00:02:19.655 CC module/blob/bdev/blob_bdev.o 00:02:19.655 LIB libspdk_env_dpdk_rpc.a 00:02:19.655 CC module/accel/iaa/accel_iaa.o 00:02:19.655 CC module/accel/iaa/accel_iaa_rpc.o 00:02:19.655 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:19.655 CC module/accel/dsa/accel_dsa.o 00:02:19.655 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:19.655 CC module/accel/dsa/accel_dsa_rpc.o 00:02:19.655 CC module/keyring/file/keyring.o 00:02:19.655 CC module/keyring/file/keyring_rpc.o 00:02:19.914 LIB libspdk_scheduler_gscheduler.a 00:02:19.914 LIB libspdk_accel_error.a 00:02:19.914 LIB libspdk_accel_ioat.a 00:02:19.914 LIB libspdk_scheduler_dpdk_governor.a 00:02:19.914 LIB libspdk_keyring_file.a 00:02:19.914 LIB libspdk_accel_iaa.a 00:02:19.914 LIB libspdk_scheduler_dynamic.a 00:02:19.914 LIB libspdk_blob_bdev.a 00:02:19.914 LIB libspdk_accel_dsa.a 00:02:19.914 LIB libspdk_vfu_device.a 00:02:20.173 LIB libspdk_sock_posix.a 00:02:20.173 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:20.173 CC module/blobfs/bdev/blobfs_bdev.o 00:02:20.173 CC module/bdev/delay/vbdev_delay.o 00:02:20.173 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:20.173 CC module/bdev/split/vbdev_split.o 00:02:20.173 CC module/bdev/aio/bdev_aio.o 00:02:20.173 CC module/bdev/aio/bdev_aio_rpc.o 00:02:20.173 CC module/bdev/gpt/gpt.o 00:02:20.173 CC module/bdev/split/vbdev_split_rpc.o 00:02:20.173 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:20.173 CC module/bdev/ftl/bdev_ftl.o 00:02:20.173 CC module/bdev/nvme/nvme_rpc.o 00:02:20.173 CC module/bdev/nvme/bdev_nvme.o 00:02:20.173 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:20.173 CC module/bdev/gpt/vbdev_gpt.o 00:02:20.173 CC module/bdev/null/bdev_null.o 00:02:20.173 CC module/bdev/nvme/bdev_mdns_client.o 00:02:20.173 CC module/bdev/nvme/vbdev_opal.o 00:02:20.173 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:20.173 CC module/bdev/null/bdev_null_rpc.o 00:02:20.173 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:20.173 CC module/bdev/raid/bdev_raid.o 00:02:20.173 CC module/bdev/lvol/vbdev_lvol.o 00:02:20.173 CC 
module/bdev/raid/bdev_raid_rpc.o 00:02:20.173 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:20.173 CC module/bdev/raid/bdev_raid_sb.o 00:02:20.173 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:20.173 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:20.173 CC module/bdev/iscsi/bdev_iscsi.o 00:02:20.173 CC module/bdev/raid/raid0.o 00:02:20.173 CC module/bdev/raid/raid1.o 00:02:20.173 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:20.173 CC module/bdev/raid/concat.o 00:02:20.173 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:20.433 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:20.433 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:20.433 CC module/bdev/malloc/bdev_malloc.o 00:02:20.433 CC module/bdev/passthru/vbdev_passthru.o 00:02:20.433 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:20.433 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:20.433 CC module/bdev/error/vbdev_error.o 00:02:20.433 CC module/bdev/error/vbdev_error_rpc.o 00:02:20.433 LIB libspdk_blobfs_bdev.a 00:02:20.433 LIB libspdk_bdev_split.a 00:02:20.433 LIB libspdk_bdev_null.a 00:02:20.433 LIB libspdk_bdev_ftl.a 00:02:20.433 LIB libspdk_bdev_aio.a 00:02:20.433 LIB libspdk_bdev_zone_block.a 00:02:20.692 LIB libspdk_bdev_delay.a 00:02:20.692 LIB libspdk_bdev_gpt.a 00:02:20.692 LIB libspdk_bdev_malloc.a 00:02:20.692 LIB libspdk_bdev_iscsi.a 00:02:20.692 LIB libspdk_bdev_error.a 00:02:20.692 LIB libspdk_bdev_lvol.a 00:02:20.692 LIB libspdk_bdev_passthru.a 00:02:20.692 LIB libspdk_bdev_virtio.a 00:02:20.951 LIB libspdk_bdev_raid.a 00:02:21.888 LIB libspdk_bdev_nvme.a 00:02:22.147 CC module/event/subsystems/keyring/keyring.o 00:02:22.147 CC module/event/subsystems/vmd/vmd.o 00:02:22.147 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:22.147 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:22.147 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:22.147 CC module/event/subsystems/scheduler/scheduler.o 00:02:22.147 CC module/event/subsystems/sock/sock.o 00:02:22.147 CC module/event/subsystems/iobuf/iobuf.o 00:02:22.147 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:22.406 LIB libspdk_event_keyring.a 00:02:22.406 LIB libspdk_event_vfu_tgt.a 00:02:22.406 LIB libspdk_event_vmd.a 00:02:22.406 LIB libspdk_event_vhost_blk.a 00:02:22.406 LIB libspdk_event_sock.a 00:02:22.406 LIB libspdk_event_scheduler.a 00:02:22.406 LIB libspdk_event_iobuf.a 00:02:22.666 CC module/event/subsystems/accel/accel.o 00:02:22.666 LIB libspdk_event_accel.a 00:02:22.924 CC module/event/subsystems/bdev/bdev.o 00:02:23.183 LIB libspdk_event_bdev.a 00:02:23.441 CC module/event/subsystems/nbd/nbd.o 00:02:23.442 CC module/event/subsystems/ublk/ublk.o 00:02:23.442 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:23.442 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:23.442 CC module/event/subsystems/scsi/scsi.o 00:02:23.442 LIB libspdk_event_nbd.a 00:02:23.442 LIB libspdk_event_ublk.a 00:02:23.442 LIB libspdk_event_scsi.a 00:02:23.701 LIB libspdk_event_nvmf.a 00:02:23.961 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:23.961 CC module/event/subsystems/iscsi/iscsi.o 00:02:23.961 LIB libspdk_event_vhost_scsi.a 00:02:23.961 LIB libspdk_event_iscsi.a 00:02:24.226 CC app/spdk_nvme_identify/identify.o 00:02:24.226 CXX app/trace/trace.o 00:02:24.226 CC app/spdk_top/spdk_top.o 00:02:24.226 CC app/spdk_nvme_perf/perf.o 00:02:24.226 CC app/spdk_nvme_discover/discovery_aer.o 00:02:24.226 CC app/trace_record/trace_record.o 00:02:24.226 CC app/spdk_lspci/spdk_lspci.o 00:02:24.226 TEST_HEADER include/spdk/accel.h 00:02:24.226 TEST_HEADER 
include/spdk/accel_module.h 00:02:24.226 TEST_HEADER include/spdk/assert.h 00:02:24.226 TEST_HEADER include/spdk/barrier.h 00:02:24.226 CC test/rpc_client/rpc_client_test.o 00:02:24.226 TEST_HEADER include/spdk/base64.h 00:02:24.226 TEST_HEADER include/spdk/bdev.h 00:02:24.226 TEST_HEADER include/spdk/bdev_module.h 00:02:24.226 TEST_HEADER include/spdk/bdev_zone.h 00:02:24.226 TEST_HEADER include/spdk/bit_array.h 00:02:24.226 TEST_HEADER include/spdk/bit_pool.h 00:02:24.226 TEST_HEADER include/spdk/blob_bdev.h 00:02:24.226 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:24.226 TEST_HEADER include/spdk/blobfs.h 00:02:24.226 TEST_HEADER include/spdk/blob.h 00:02:24.226 TEST_HEADER include/spdk/conf.h 00:02:24.226 TEST_HEADER include/spdk/config.h 00:02:24.226 TEST_HEADER include/spdk/cpuset.h 00:02:24.226 TEST_HEADER include/spdk/crc16.h 00:02:24.226 CC app/iscsi_tgt/iscsi_tgt.o 00:02:24.226 TEST_HEADER include/spdk/crc32.h 00:02:24.226 TEST_HEADER include/spdk/crc64.h 00:02:24.226 TEST_HEADER include/spdk/dif.h 00:02:24.226 TEST_HEADER include/spdk/dma.h 00:02:24.226 TEST_HEADER include/spdk/endian.h 00:02:24.226 TEST_HEADER include/spdk/env_dpdk.h 00:02:24.226 TEST_HEADER include/spdk/env.h 00:02:24.487 CC app/spdk_dd/spdk_dd.o 00:02:24.487 TEST_HEADER include/spdk/event.h 00:02:24.487 TEST_HEADER include/spdk/fd_group.h 00:02:24.487 TEST_HEADER include/spdk/fd.h 00:02:24.487 TEST_HEADER include/spdk/file.h 00:02:24.487 CC app/nvmf_tgt/nvmf_main.o 00:02:24.487 CC app/vhost/vhost.o 00:02:24.487 TEST_HEADER include/spdk/ftl.h 00:02:24.487 TEST_HEADER include/spdk/gpt_spec.h 00:02:24.487 TEST_HEADER include/spdk/hexlify.h 00:02:24.487 CC app/spdk_tgt/spdk_tgt.o 00:02:24.487 TEST_HEADER include/spdk/histogram_data.h 00:02:24.487 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:24.487 TEST_HEADER include/spdk/idxd.h 00:02:24.487 TEST_HEADER include/spdk/idxd_spec.h 00:02:24.487 TEST_HEADER include/spdk/init.h 00:02:24.487 TEST_HEADER include/spdk/ioat.h 00:02:24.487 TEST_HEADER include/spdk/ioat_spec.h 00:02:24.487 TEST_HEADER include/spdk/iscsi_spec.h 00:02:24.487 TEST_HEADER include/spdk/json.h 00:02:24.487 TEST_HEADER include/spdk/jsonrpc.h 00:02:24.487 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:24.487 TEST_HEADER include/spdk/keyring.h 00:02:24.487 CC test/nvme/aer/aer.o 00:02:24.487 CC app/fio/nvme/fio_plugin.o 00:02:24.487 CC examples/nvme/arbitration/arbitration.o 00:02:24.487 TEST_HEADER include/spdk/keyring_module.h 00:02:24.487 CC examples/accel/perf/accel_perf.o 00:02:24.487 CC examples/nvme/reconnect/reconnect.o 00:02:24.487 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:24.487 CC test/nvme/sgl/sgl.o 00:02:24.487 CC examples/nvme/abort/abort.o 00:02:24.487 CC test/env/vtophys/vtophys.o 00:02:24.487 TEST_HEADER include/spdk/likely.h 00:02:24.487 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:24.487 CC examples/idxd/perf/perf.o 00:02:24.487 CC test/nvme/startup/startup.o 00:02:24.487 CC examples/nvme/hotplug/hotplug.o 00:02:24.487 TEST_HEADER include/spdk/log.h 00:02:24.487 CC test/nvme/overhead/overhead.o 00:02:24.487 CC examples/ioat/verify/verify.o 00:02:24.487 CC examples/nvme/hello_world/hello_world.o 00:02:24.487 TEST_HEADER include/spdk/lvol.h 00:02:24.487 CC test/event/event_perf/event_perf.o 00:02:24.487 CC examples/vmd/lsvmd/lsvmd.o 00:02:24.487 CC test/nvme/simple_copy/simple_copy.o 00:02:24.487 CC test/event/reactor/reactor.o 00:02:24.487 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:24.487 CC test/app/histogram_perf/histogram_perf.o 00:02:24.487 
TEST_HEADER include/spdk/memory.h 00:02:24.487 CC examples/vmd/led/led.o 00:02:24.487 TEST_HEADER include/spdk/mmio.h 00:02:24.487 CC examples/ioat/perf/perf.o 00:02:24.487 CC test/nvme/boot_partition/boot_partition.o 00:02:24.487 CC test/nvme/reset/reset.o 00:02:24.487 CC test/env/memory/memory_ut.o 00:02:24.487 CC examples/sock/hello_world/hello_sock.o 00:02:24.487 CC examples/util/zipf/zipf.o 00:02:24.487 CC test/nvme/reserve/reserve.o 00:02:24.487 TEST_HEADER include/spdk/nbd.h 00:02:24.487 CC test/env/pci/pci_ut.o 00:02:24.487 CC test/app/jsoncat/jsoncat.o 00:02:24.487 CC test/thread/lock/spdk_lock.o 00:02:24.487 CC test/nvme/connect_stress/connect_stress.o 00:02:24.487 TEST_HEADER include/spdk/notify.h 00:02:24.487 CC test/nvme/compliance/nvme_compliance.o 00:02:24.487 TEST_HEADER include/spdk/nvme.h 00:02:24.487 CC test/thread/poller_perf/poller_perf.o 00:02:24.487 CC test/event/reactor_perf/reactor_perf.o 00:02:24.487 TEST_HEADER include/spdk/nvme_intel.h 00:02:24.487 CC test/nvme/err_injection/err_injection.o 00:02:24.487 CC test/nvme/e2edp/nvme_dp.o 00:02:24.487 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:24.487 CC app/fio/bdev/fio_plugin.o 00:02:24.487 CC test/event/app_repeat/app_repeat.o 00:02:24.487 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:24.487 TEST_HEADER include/spdk/nvme_spec.h 00:02:24.487 CC examples/blob/hello_world/hello_blob.o 00:02:24.487 TEST_HEADER include/spdk/nvme_zns.h 00:02:24.487 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:24.487 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:24.487 TEST_HEADER include/spdk/nvmf.h 00:02:24.487 CC examples/blob/cli/blobcli.o 00:02:24.487 TEST_HEADER include/spdk/nvmf_spec.h 00:02:24.487 CC test/bdev/bdevio/bdevio.o 00:02:24.487 TEST_HEADER include/spdk/nvmf_transport.h 00:02:24.487 CC test/event/scheduler/scheduler.o 00:02:24.487 CC test/accel/dif/dif.o 00:02:24.487 LINK spdk_lspci 00:02:24.487 TEST_HEADER include/spdk/opal.h 00:02:24.487 TEST_HEADER include/spdk/opal_spec.h 00:02:24.487 CC examples/thread/thread/thread_ex.o 00:02:24.487 TEST_HEADER include/spdk/pci_ids.h 00:02:24.487 TEST_HEADER include/spdk/pipe.h 00:02:24.487 CC test/blobfs/mkfs/mkfs.o 00:02:24.487 CC examples/bdev/hello_world/hello_bdev.o 00:02:24.487 TEST_HEADER include/spdk/queue.h 00:02:24.487 CC examples/bdev/bdevperf/bdevperf.o 00:02:24.487 CC test/app/bdev_svc/bdev_svc.o 00:02:24.487 TEST_HEADER include/spdk/reduce.h 00:02:24.487 TEST_HEADER include/spdk/rpc.h 00:02:24.487 CC examples/nvmf/nvmf/nvmf.o 00:02:24.487 TEST_HEADER include/spdk/scheduler.h 00:02:24.487 CC test/dma/test_dma/test_dma.o 00:02:24.487 TEST_HEADER include/spdk/scsi.h 00:02:24.487 TEST_HEADER include/spdk/scsi_spec.h 00:02:24.487 TEST_HEADER include/spdk/sock.h 00:02:24.487 TEST_HEADER include/spdk/stdinc.h 00:02:24.487 TEST_HEADER include/spdk/string.h 00:02:24.487 TEST_HEADER include/spdk/thread.h 00:02:24.487 LINK spdk_nvme_discover 00:02:24.487 LINK rpc_client_test 00:02:24.487 TEST_HEADER include/spdk/trace.h 00:02:24.487 TEST_HEADER include/spdk/trace_parser.h 00:02:24.487 CC test/env/mem_callbacks/mem_callbacks.o 00:02:24.487 TEST_HEADER include/spdk/tree.h 00:02:24.487 TEST_HEADER include/spdk/ublk.h 00:02:24.487 TEST_HEADER include/spdk/util.h 00:02:24.487 TEST_HEADER include/spdk/uuid.h 00:02:24.487 TEST_HEADER include/spdk/version.h 00:02:24.487 TEST_HEADER include/spdk/vfio_user_pci.h 00:02:24.487 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:24.487 TEST_HEADER include/spdk/vhost.h 00:02:24.487 CC test/lvol/esnap/esnap.o 00:02:24.487 TEST_HEADER 
include/spdk/vmd.h 00:02:24.487 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:24.487 TEST_HEADER include/spdk/xor.h 00:02:24.487 TEST_HEADER include/spdk/zipf.h 00:02:24.487 CXX test/cpp_headers/accel.o 00:02:24.753 LINK spdk_trace_record 00:02:24.753 LINK lsvmd 00:02:24.753 LINK vtophys 00:02:24.753 LINK led 00:02:24.753 LINK nvmf_tgt 00:02:24.753 LINK histogram_perf 00:02:24.753 LINK reactor 00:02:24.753 LINK event_perf 00:02:24.753 LINK vhost 00:02:24.753 LINK jsoncat 00:02:24.753 LINK iscsi_tgt 00:02:24.753 LINK interrupt_tgt 00:02:24.753 LINK reactor_perf 00:02:24.753 LINK poller_perf 00:02:24.753 LINK zipf 00:02:24.753 LINK startup 00:02:24.753 fio_plugin.c:1491:29: warning: field 'ruhs' with variable sized type 'struct spdk_nvme_fdp_ruhs' not at the end of a struct or class is a GNU extension [-Wgnu-variable-sized-type-not-at-end] 00:02:24.753 struct spdk_nvme_fdp_ruhs ruhs; 00:02:24.753 ^ 00:02:24.753 LINK app_repeat 00:02:24.753 LINK pmr_persistence 00:02:24.753 LINK connect_stress 00:02:24.753 LINK boot_partition 00:02:24.753 LINK env_dpdk_post_init 00:02:24.753 LINK spdk_tgt 00:02:24.753 LINK cmb_copy 00:02:24.753 LINK err_injection 00:02:24.753 LINK reserve 00:02:24.753 LINK hotplug 00:02:24.753 LINK verify 00:02:24.753 LINK ioat_perf 00:02:24.753 LINK hello_world 00:02:24.753 LINK simple_copy 00:02:24.753 LINK hello_sock 00:02:24.753 LINK bdev_svc 00:02:24.753 LINK aer 00:02:24.753 LINK sgl 00:02:24.753 LINK hello_blob 00:02:24.753 LINK mkfs 00:02:24.753 LINK scheduler 00:02:24.753 LINK reset 00:02:24.753 LINK hello_bdev 00:02:24.753 CXX test/cpp_headers/accel_module.o 00:02:24.753 LINK spdk_trace 00:02:24.753 LINK overhead 00:02:24.753 LINK nvme_dp 00:02:24.753 LINK thread 00:02:25.015 LINK reconnect 00:02:25.015 LINK idxd_perf 00:02:25.015 LINK nvmf 00:02:25.015 LINK abort 00:02:25.015 LINK arbitration 00:02:25.015 LINK spdk_dd 00:02:25.015 LINK nvme_manage 00:02:25.015 LINK bdevio 00:02:25.015 LINK accel_perf 00:02:25.015 LINK dif 00:02:25.015 LINK pci_ut 00:02:25.015 LINK test_dma 00:02:25.015 LINK nvme_compliance 00:02:25.015 CXX test/cpp_headers/assert.o 00:02:25.015 1 warning generated. 
00:02:25.015 LINK nvme_fuzz 00:02:25.277 LINK blobcli 00:02:25.277 LINK spdk_nvme 00:02:25.277 LINK spdk_bdev 00:02:25.277 CXX test/cpp_headers/barrier.o 00:02:25.277 LINK mem_callbacks 00:02:25.277 LINK spdk_nvme_identify 00:02:25.277 CC test/nvme/fused_ordering/fused_ordering.o 00:02:25.277 LINK spdk_nvme_perf 00:02:25.536 CXX test/cpp_headers/base64.o 00:02:25.536 CXX test/cpp_headers/bdev.o 00:02:25.536 CXX test/cpp_headers/bdev_module.o 00:02:25.536 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:25.536 CC test/app/stub/stub.o 00:02:25.536 LINK bdevperf 00:02:25.536 CXX test/cpp_headers/bdev_zone.o 00:02:25.536 LINK spdk_top 00:02:25.536 CXX test/cpp_headers/bit_array.o 00:02:25.536 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:25.536 CC test/nvme/fdp/fdp.o 00:02:25.536 LINK memory_ut 00:02:25.536 CXX test/cpp_headers/bit_pool.o 00:02:25.536 CC test/nvme/cuse/cuse.o 00:02:25.536 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:25.536 CXX test/cpp_headers/blob_bdev.o 00:02:25.536 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:25.803 LINK fused_ordering 00:02:25.803 CXX test/cpp_headers/blobfs_bdev.o 00:02:25.803 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:25.803 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:25.803 CXX test/cpp_headers/blobfs.o 00:02:25.803 CXX test/cpp_headers/blob.o 00:02:25.803 CXX test/cpp_headers/conf.o 00:02:25.803 CXX test/cpp_headers/config.o 00:02:25.803 LINK stub 00:02:25.803 CXX test/cpp_headers/cpuset.o 00:02:25.803 LINK doorbell_aers 00:02:25.803 CXX test/cpp_headers/crc16.o 00:02:25.803 CXX test/cpp_headers/crc64.o 00:02:25.803 CXX test/cpp_headers/crc32.o 00:02:25.803 CXX test/cpp_headers/dif.o 00:02:25.803 CXX test/cpp_headers/dma.o 00:02:25.803 LINK fdp 00:02:25.803 CXX test/cpp_headers/endian.o 00:02:25.803 CXX test/cpp_headers/env_dpdk.o 00:02:25.803 CXX test/cpp_headers/env.o 00:02:25.803 CXX test/cpp_headers/event.o 00:02:26.065 CXX test/cpp_headers/fd_group.o 00:02:26.065 CXX test/cpp_headers/fd.o 00:02:26.065 CXX test/cpp_headers/file.o 00:02:26.065 CXX test/cpp_headers/ftl.o 00:02:26.065 CXX test/cpp_headers/hexlify.o 00:02:26.065 CXX test/cpp_headers/gpt_spec.o 00:02:26.065 CXX test/cpp_headers/histogram_data.o 00:02:26.065 CXX test/cpp_headers/idxd.o 00:02:26.065 CXX test/cpp_headers/idxd_spec.o 00:02:26.065 CXX test/cpp_headers/init.o 00:02:26.065 CXX test/cpp_headers/ioat.o 00:02:26.065 CXX test/cpp_headers/ioat_spec.o 00:02:26.065 CXX test/cpp_headers/iscsi_spec.o 00:02:26.065 LINK llvm_vfio_fuzz 00:02:26.065 CXX test/cpp_headers/json.o 00:02:26.065 CXX test/cpp_headers/jsonrpc.o 00:02:26.065 CXX test/cpp_headers/keyring.o 00:02:26.065 CXX test/cpp_headers/keyring_module.o 00:02:26.065 CXX test/cpp_headers/likely.o 00:02:26.065 CXX test/cpp_headers/log.o 00:02:26.065 CXX test/cpp_headers/lvol.o 00:02:26.065 CXX test/cpp_headers/memory.o 00:02:26.065 CXX test/cpp_headers/mmio.o 00:02:26.065 CXX test/cpp_headers/nbd.o 00:02:26.065 CXX test/cpp_headers/notify.o 00:02:26.065 CXX test/cpp_headers/nvme.o 00:02:26.065 CXX test/cpp_headers/nvme_intel.o 00:02:26.065 CXX test/cpp_headers/nvme_ocssd.o 00:02:26.065 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:26.065 CXX test/cpp_headers/nvme_spec.o 00:02:26.065 CXX test/cpp_headers/nvme_zns.o 00:02:26.065 CXX test/cpp_headers/nvmf_cmd.o 00:02:26.065 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:26.325 CXX test/cpp_headers/nvmf.o 00:02:26.325 CXX test/cpp_headers/nvmf_spec.o 00:02:26.325 CXX test/cpp_headers/nvmf_transport.o 00:02:26.325 CXX test/cpp_headers/opal.o 00:02:26.325 CXX 
test/cpp_headers/opal_spec.o 00:02:26.325 CXX test/cpp_headers/pci_ids.o 00:02:26.325 LINK vhost_fuzz 00:02:26.325 CXX test/cpp_headers/pipe.o 00:02:26.325 CXX test/cpp_headers/reduce.o 00:02:26.325 CXX test/cpp_headers/queue.o 00:02:26.325 CXX test/cpp_headers/rpc.o 00:02:26.325 CXX test/cpp_headers/scheduler.o 00:02:26.325 CXX test/cpp_headers/scsi.o 00:02:26.325 CXX test/cpp_headers/scsi_spec.o 00:02:26.325 CXX test/cpp_headers/sock.o 00:02:26.325 CXX test/cpp_headers/stdinc.o 00:02:26.325 CXX test/cpp_headers/string.o 00:02:26.325 CXX test/cpp_headers/thread.o 00:02:26.325 CXX test/cpp_headers/trace.o 00:02:26.325 CXX test/cpp_headers/trace_parser.o 00:02:26.325 CXX test/cpp_headers/tree.o 00:02:26.325 CXX test/cpp_headers/ublk.o 00:02:26.325 CXX test/cpp_headers/util.o 00:02:26.325 CXX test/cpp_headers/uuid.o 00:02:26.325 CXX test/cpp_headers/version.o 00:02:26.325 CXX test/cpp_headers/vfio_user_pci.o 00:02:26.325 CXX test/cpp_headers/vfio_user_spec.o 00:02:26.325 CXX test/cpp_headers/vhost.o 00:02:26.325 CXX test/cpp_headers/vmd.o 00:02:26.325 CXX test/cpp_headers/xor.o 00:02:26.325 CXX test/cpp_headers/zipf.o 00:02:26.325 LINK spdk_lock 00:02:26.584 LINK llvm_nvme_fuzz 00:02:26.584 LINK cuse 00:02:27.150 LINK iscsi_fuzz 00:02:29.056 LINK esnap 00:02:29.314 00:02:29.315 real 0m43.609s 00:02:29.315 user 6m40.910s 00:02:29.315 sys 2m30.792s 00:02:29.315 19:58:13 -- common/autotest_common.sh@1122 -- $ xtrace_disable 00:02:29.315 19:58:13 -- common/autotest_common.sh@10 -- $ set +x 00:02:29.315 ************************************ 00:02:29.315 END TEST make 00:02:29.315 ************************************ 00:02:29.315 19:58:13 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:29.315 19:58:13 -- pm/common@30 -- $ signal_monitor_resources TERM 00:02:29.315 19:58:13 -- pm/common@41 -- $ local monitor pid pids signal=TERM 00:02:29.315 19:58:13 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:29.315 19:58:13 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:29.315 19:58:13 -- pm/common@45 -- $ pid=1492072 00:02:29.315 19:58:13 -- pm/common@52 -- $ sudo kill -TERM 1492072 00:02:29.315 19:58:13 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:29.315 19:58:13 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:29.315 19:58:13 -- pm/common@45 -- $ pid=1492079 00:02:29.315 19:58:13 -- pm/common@52 -- $ sudo kill -TERM 1492079 00:02:29.315 19:58:13 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:29.315 19:58:13 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:29.315 19:58:13 -- pm/common@45 -- $ pid=1492081 00:02:29.315 19:58:13 -- pm/common@52 -- $ sudo kill -TERM 1492081 00:02:29.315 19:58:13 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:29.315 19:58:13 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:29.315 19:58:13 -- pm/common@45 -- $ pid=1492073 00:02:29.315 19:58:13 -- pm/common@52 -- $ sudo kill -TERM 1492073 00:02:29.574 19:58:13 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:29.574 19:58:13 -- nvmf/common.sh@7 -- # uname -s 00:02:29.574 19:58:13 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:29.574 19:58:13 -- nvmf/common.sh@9 -- # 
NVMF_PORT=4420 00:02:29.574 19:58:13 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:29.574 19:58:13 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:29.574 19:58:13 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:29.574 19:58:13 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:29.574 19:58:13 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:29.574 19:58:13 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:29.574 19:58:13 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:29.574 19:58:13 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:29.574 19:58:13 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:02:29.574 19:58:13 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:02:29.574 19:58:13 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:29.574 19:58:13 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:29.574 19:58:13 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:29.574 19:58:13 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:29.574 19:58:13 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:29.574 19:58:13 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:29.574 19:58:13 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:29.574 19:58:13 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:29.574 19:58:13 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:29.574 19:58:13 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:29.574 19:58:13 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:29.574 19:58:13 -- paths/export.sh@5 -- # export PATH 00:02:29.574 19:58:13 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:29.574 19:58:13 -- nvmf/common.sh@47 -- # : 0 00:02:29.574 19:58:13 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:29.574 19:58:13 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:29.574 19:58:13 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:29.574 19:58:13 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:29.574 19:58:13 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:29.574 19:58:13 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:29.574 19:58:13 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:29.574 19:58:13 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:29.574 19:58:13 -- spdk/autotest.sh@27 -- # '[' 0 -ne 
0 ']' 00:02:29.574 19:58:13 -- spdk/autotest.sh@32 -- # uname -s 00:02:29.574 19:58:13 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:29.574 19:58:13 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:29.574 19:58:13 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:29.574 19:58:13 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:29.574 19:58:13 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:29.574 19:58:13 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:29.574 19:58:13 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:29.574 19:58:13 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:29.574 19:58:13 -- spdk/autotest.sh@48 -- # udevadm_pid=1549834 00:02:29.574 19:58:13 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:29.574 19:58:13 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:29.574 19:58:13 -- pm/common@17 -- # local monitor 00:02:29.574 19:58:13 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:29.574 19:58:13 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=1549836 00:02:29.574 19:58:13 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:29.574 19:58:13 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=1549839 00:02:29.574 19:58:13 -- pm/common@21 -- # date +%s 00:02:29.574 19:58:13 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:29.574 19:58:13 -- pm/common@21 -- # date +%s 00:02:29.574 19:58:13 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=1549842 00:02:29.574 19:58:13 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:29.574 19:58:13 -- pm/common@21 -- # date +%s 00:02:29.574 19:58:13 -- pm/common@23 -- # MONITOR_RESOURCES_PIDS["$monitor"]=1549847 00:02:29.574 19:58:13 -- pm/common@26 -- # sleep 1 00:02:29.575 19:58:13 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714154293 00:02:29.575 19:58:13 -- pm/common@21 -- # date +%s 00:02:29.575 19:58:13 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714154293 00:02:29.575 19:58:13 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714154293 00:02:29.575 19:58:13 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714154293 00:02:29.575 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1714154293_collect-vmstat.pm.log 00:02:29.575 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1714154293_collect-bmc-pm.bmc.pm.log 00:02:29.575 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1714154293_collect-cpu-temp.pm.log 00:02:29.575 Redirecting to 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autotest.sh.1714154293_collect-cpu-load.pm.log 00:02:30.508 19:58:14 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:30.508 19:58:14 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:30.508 19:58:14 -- common/autotest_common.sh@720 -- # xtrace_disable 00:02:30.508 19:58:14 -- common/autotest_common.sh@10 -- # set +x 00:02:30.508 19:58:14 -- spdk/autotest.sh@59 -- # create_test_list 00:02:30.508 19:58:14 -- common/autotest_common.sh@744 -- # xtrace_disable 00:02:30.508 19:58:14 -- common/autotest_common.sh@10 -- # set +x 00:02:30.767 19:58:14 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:30.767 19:58:14 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:30.767 19:58:14 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:30.767 19:58:14 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:30.767 19:58:14 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:30.767 19:58:14 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:30.767 19:58:14 -- common/autotest_common.sh@1451 -- # uname 00:02:30.767 19:58:14 -- common/autotest_common.sh@1451 -- # '[' Linux = FreeBSD ']' 00:02:30.767 19:58:14 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:02:30.767 19:58:14 -- common/autotest_common.sh@1471 -- # uname 00:02:30.767 19:58:15 -- common/autotest_common.sh@1471 -- # [[ Linux = FreeBSD ]] 00:02:30.767 19:58:15 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:30.767 19:58:15 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=clang 00:02:30.767 19:58:15 -- spdk/autotest.sh@72 -- # hash lcov 00:02:30.767 19:58:15 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]] 00:02:30.767 19:58:15 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:02:30.767 19:58:15 -- common/autotest_common.sh@720 -- # xtrace_disable 00:02:30.767 19:58:15 -- common/autotest_common.sh@10 -- # set +x 00:02:30.767 19:58:15 -- spdk/autotest.sh@91 -- # rm -f 00:02:30.767 19:58:15 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:35.054 0000:1a:00.0 (8086 0a54): Already using the nvme driver 00:02:35.054 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:02:35.054 0000:80:04.0 (8086 2021): Already 
using the ioatdma driver 00:02:36.960 19:58:21 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:02:36.960 19:58:21 -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:02:36.960 19:58:21 -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:02:36.960 19:58:21 -- common/autotest_common.sh@1666 -- # local nvme bdf 00:02:36.960 19:58:21 -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:02:36.960 19:58:21 -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:02:36.960 19:58:21 -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:02:36.960 19:58:21 -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:36.960 19:58:21 -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:02:36.960 19:58:21 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:02:36.960 19:58:21 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:02:36.960 19:58:21 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:02:36.960 19:58:21 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:02:36.960 19:58:21 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:02:36.960 19:58:21 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:02:36.960 No valid GPT data, bailing 00:02:36.960 19:58:21 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:02:36.960 19:58:21 -- scripts/common.sh@391 -- # pt= 00:02:36.960 19:58:21 -- scripts/common.sh@392 -- # return 1 00:02:36.960 19:58:21 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:02:36.960 1+0 records in 00:02:36.960 1+0 records out 00:02:36.960 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00169707 s, 618 MB/s 00:02:36.960 19:58:21 -- spdk/autotest.sh@118 -- # sync 00:02:36.960 19:58:21 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:02:36.960 19:58:21 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:02:36.960 19:58:21 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:02:42.235 19:58:26 -- spdk/autotest.sh@124 -- # uname -s 00:02:42.235 19:58:26 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:02:42.235 19:58:26 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:42.235 19:58:26 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:02:42.235 19:58:26 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:02:42.235 19:58:26 -- common/autotest_common.sh@10 -- # set +x 00:02:42.235 ************************************ 00:02:42.235 START TEST setup.sh 00:02:42.235 ************************************ 00:02:42.235 19:58:26 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:02:42.235 * Looking for test storage... 
00:02:42.235 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:42.235 19:58:26 -- setup/test-setup.sh@10 -- # uname -s 00:02:42.236 19:58:26 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:02:42.236 19:58:26 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:42.236 19:58:26 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:02:42.236 19:58:26 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:02:42.236 19:58:26 -- common/autotest_common.sh@10 -- # set +x 00:02:42.236 ************************************ 00:02:42.236 START TEST acl 00:02:42.236 ************************************ 00:02:42.236 19:58:26 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:02:42.236 * Looking for test storage... 00:02:42.236 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:02:42.236 19:58:26 -- setup/acl.sh@10 -- # get_zoned_devs 00:02:42.236 19:58:26 -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:02:42.236 19:58:26 -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:02:42.236 19:58:26 -- common/autotest_common.sh@1666 -- # local nvme bdf 00:02:42.236 19:58:26 -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:02:42.236 19:58:26 -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:02:42.236 19:58:26 -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:02:42.236 19:58:26 -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:02:42.236 19:58:26 -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:02:42.236 19:58:26 -- setup/acl.sh@12 -- # devs=() 00:02:42.236 19:58:26 -- setup/acl.sh@12 -- # declare -a devs 00:02:42.236 19:58:26 -- setup/acl.sh@13 -- # drivers=() 00:02:42.236 19:58:26 -- setup/acl.sh@13 -- # declare -A drivers 00:02:42.236 19:58:26 -- setup/acl.sh@51 -- # setup reset 00:02:42.236 19:58:26 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:42.236 19:58:26 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:02:48.805 19:58:31 -- setup/acl.sh@52 -- # collect_setup_devs 00:02:48.805 19:58:31 -- setup/acl.sh@16 -- # local dev driver 00:02:48.805 19:58:31 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:48.805 19:58:31 -- setup/acl.sh@15 -- # setup output status 00:02:48.805 19:58:31 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:48.805 19:58:31 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:02:51.342 Hugepages 00:02:51.342 node hugesize free / total 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 00:02:51.342 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # continue 
00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:1a:00.0 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@22 -- # devs+=("$dev") 00:02:51.342 19:58:35 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- 
setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:02:51.342 19:58:35 -- setup/acl.sh@20 -- # continue 00:02:51.342 19:58:35 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:02:51.342 19:58:35 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:02:51.342 19:58:35 -- setup/acl.sh@54 -- # run_test denied denied 00:02:51.342 19:58:35 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:02:51.342 19:58:35 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:02:51.342 19:58:35 -- common/autotest_common.sh@10 -- # set +x 00:02:51.342 ************************************ 00:02:51.342 START TEST denied 00:02:51.342 ************************************ 00:02:51.342 19:58:35 -- common/autotest_common.sh@1121 -- # denied 00:02:51.342 19:58:35 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:1a:00.0' 00:02:51.342 19:58:35 -- setup/acl.sh@38 -- # setup output config 00:02:51.342 19:58:35 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:1a:00.0' 00:02:51.342 19:58:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:02:51.342 19:58:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:02:57.991 0000:1a:00.0 (8086 0a54): Skipping denied controller at 0000:1a:00.0 00:02:57.991 19:58:41 -- setup/acl.sh@40 -- # verify 0000:1a:00.0 00:02:57.991 19:58:41 -- setup/acl.sh@28 -- # local dev driver 00:02:57.991 19:58:41 -- setup/acl.sh@30 -- # for dev in "$@" 00:02:57.991 19:58:41 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:1a:00.0 ]] 00:02:57.991 19:58:41 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:1a:00.0/driver 00:02:57.991 19:58:41 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:02:57.991 19:58:41 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:02:57.991 19:58:41 -- setup/acl.sh@41 -- # setup reset 00:02:57.991 19:58:41 -- setup/common.sh@9 -- # [[ reset == output ]] 00:02:57.991 19:58:41 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:04.575 00:03:04.575 real 0m12.816s 00:03:04.575 user 0m4.235s 00:03:04.575 sys 0m7.900s 00:03:04.575 19:58:48 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:04.575 19:58:48 -- common/autotest_common.sh@10 -- # set +x 00:03:04.575 ************************************ 00:03:04.575 END TEST denied 00:03:04.575 ************************************ 
00:03:04.575 19:58:48 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:04.575 19:58:48 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:04.575 19:58:48 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:04.575 19:58:48 -- common/autotest_common.sh@10 -- # set +x 00:03:04.575 ************************************ 00:03:04.575 START TEST allowed 00:03:04.575 ************************************ 00:03:04.575 19:58:48 -- common/autotest_common.sh@1121 -- # allowed 00:03:04.575 19:58:48 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:1a:00.0 00:03:04.575 19:58:48 -- setup/acl.sh@45 -- # setup output config 00:03:04.575 19:58:48 -- setup/acl.sh@46 -- # grep -E '0000:1a:00.0 .*: nvme -> .*' 00:03:04.575 19:58:48 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:04.575 19:58:48 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:14.560 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:03:14.560 19:58:57 -- setup/acl.sh@47 -- # verify 00:03:14.560 19:58:57 -- setup/acl.sh@28 -- # local dev driver 00:03:14.560 19:58:57 -- setup/acl.sh@48 -- # setup reset 00:03:14.560 19:58:57 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:14.560 19:58:57 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:18.751 00:03:18.751 real 0m14.398s 00:03:18.751 user 0m3.653s 00:03:18.751 sys 0m7.501s 00:03:18.751 19:59:03 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:18.751 19:59:03 -- common/autotest_common.sh@10 -- # set +x 00:03:18.751 ************************************ 00:03:18.751 END TEST allowed 00:03:18.751 ************************************ 00:03:18.751 00:03:18.751 real 0m36.646s 00:03:18.751 user 0m10.808s 00:03:18.751 sys 0m21.842s 00:03:18.751 19:59:03 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:18.751 19:59:03 -- common/autotest_common.sh@10 -- # set +x 00:03:18.751 ************************************ 00:03:18.751 END TEST acl 00:03:18.751 ************************************ 00:03:18.751 19:59:03 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:18.752 19:59:03 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:18.752 19:59:03 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:18.752 19:59:03 -- common/autotest_common.sh@10 -- # set +x 00:03:19.020 ************************************ 00:03:19.020 START TEST hugepages 00:03:19.020 ************************************ 00:03:19.020 19:59:03 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:19.020 * Looking for test storage... 
00:03:19.020 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:19.020 19:59:03 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:19.020 19:59:03 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:19.020 19:59:03 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:19.020 19:59:03 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:19.020 19:59:03 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:19.020 19:59:03 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:19.021 19:59:03 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:19.021 19:59:03 -- setup/common.sh@18 -- # local node= 00:03:19.021 19:59:03 -- setup/common.sh@19 -- # local var val 00:03:19.021 19:59:03 -- setup/common.sh@20 -- # local mem_f mem 00:03:19.021 19:59:03 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:19.021 19:59:03 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:19.021 19:59:03 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:19.021 19:59:03 -- setup/common.sh@28 -- # mapfile -t mem 00:03:19.021 19:59:03 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 73556680 kB' 'MemAvailable: 77212796 kB' 'Buffers: 19064 kB' 'Cached: 11478584 kB' 'SwapCached: 0 kB' 'Active: 8518500 kB' 'Inactive: 3531884 kB' 'Active(anon): 7852984 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 556124 kB' 'Mapped: 203016 kB' 'Shmem: 7300248 kB' 'KReclaimable: 224676 kB' 'Slab: 605516 kB' 'SReclaimable: 224676 kB' 'SUnreclaim: 380840 kB' 'KernelStack: 16352 kB' 'PageTables: 8952 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438204 kB' 'Committed_AS: 9232848 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210820 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e 
]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.021 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.021 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- 
# [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- 
setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.283 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.283 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # continue 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # IFS=': ' 00:03:19.284 19:59:03 -- setup/common.sh@31 -- # read -r var val _ 00:03:19.284 19:59:03 -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:19.284 19:59:03 -- setup/common.sh@33 -- # echo 2048 00:03:19.284 19:59:03 -- setup/common.sh@33 -- # return 0 00:03:19.284 19:59:03 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:19.284 19:59:03 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:19.284 19:59:03 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:19.284 19:59:03 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:19.284 19:59:03 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:19.284 19:59:03 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 
00:03:19.284 19:59:03 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:19.284 19:59:03 -- setup/hugepages.sh@207 -- # get_nodes 00:03:19.284 19:59:03 -- setup/hugepages.sh@27 -- # local node 00:03:19.284 19:59:03 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:19.284 19:59:03 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:19.284 19:59:03 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:19.284 19:59:03 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:19.284 19:59:03 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:19.284 19:59:03 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:19.284 19:59:03 -- setup/hugepages.sh@208 -- # clear_hp 00:03:19.284 19:59:03 -- setup/hugepages.sh@37 -- # local node hp 00:03:19.284 19:59:03 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:19.284 19:59:03 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:19.284 19:59:03 -- setup/hugepages.sh@41 -- # echo 0 00:03:19.284 19:59:03 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:19.284 19:59:03 -- setup/hugepages.sh@41 -- # echo 0 00:03:19.284 19:59:03 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:19.284 19:59:03 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:19.284 19:59:03 -- setup/hugepages.sh@41 -- # echo 0 00:03:19.284 19:59:03 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:19.284 19:59:03 -- setup/hugepages.sh@41 -- # echo 0 00:03:19.284 19:59:03 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:19.284 19:59:03 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:19.284 19:59:03 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:19.284 19:59:03 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:19.284 19:59:03 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:19.284 19:59:03 -- common/autotest_common.sh@10 -- # set +x 00:03:19.284 ************************************ 00:03:19.284 START TEST default_setup 00:03:19.284 ************************************ 00:03:19.284 19:59:03 -- common/autotest_common.sh@1121 -- # default_setup 00:03:19.284 19:59:03 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:19.284 19:59:03 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:19.284 19:59:03 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:19.284 19:59:03 -- setup/hugepages.sh@51 -- # shift 00:03:19.284 19:59:03 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:19.284 19:59:03 -- setup/hugepages.sh@52 -- # local node_ids 00:03:19.284 19:59:03 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:19.284 19:59:03 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:19.284 19:59:03 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:19.284 19:59:03 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:19.284 19:59:03 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:19.284 19:59:03 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:19.284 19:59:03 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:19.284 19:59:03 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:19.284 19:59:03 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:19.284 19:59:03 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:19.284 19:59:03 -- 
setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:19.284 19:59:03 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:19.284 19:59:03 -- setup/hugepages.sh@73 -- # return 0 00:03:19.284 19:59:03 -- setup/hugepages.sh@137 -- # setup output 00:03:19.284 19:59:03 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:19.284 19:59:03 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:22.596 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:22.596 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:22.855 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:22.855 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:26.140 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:03:28.048 19:59:12 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:28.048 19:59:12 -- setup/hugepages.sh@89 -- # local node 00:03:28.048 19:59:12 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:28.048 19:59:12 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:28.048 19:59:12 -- setup/hugepages.sh@92 -- # local surp 00:03:28.048 19:59:12 -- setup/hugepages.sh@93 -- # local resv 00:03:28.048 19:59:12 -- setup/hugepages.sh@94 -- # local anon 00:03:28.048 19:59:12 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:28.048 19:59:12 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:28.048 19:59:12 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:28.048 19:59:12 -- setup/common.sh@18 -- # local node= 00:03:28.048 19:59:12 -- setup/common.sh@19 -- # local var val 00:03:28.048 19:59:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.048 19:59:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.048 19:59:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.048 19:59:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.048 19:59:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.048 19:59:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75705080 kB' 'MemAvailable: 79361048 kB' 'Buffers: 19064 kB' 'Cached: 11478744 kB' 'SwapCached: 0 kB' 'Active: 8535036 kB' 'Inactive: 3531884 kB' 'Active(anon): 7869520 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572452 kB' 'Mapped: 202776 kB' 'Shmem: 7300408 kB' 'KReclaimable: 224380 kB' 'Slab: 603296 kB' 'SReclaimable: 224380 kB' 'SUnreclaim: 378916 kB' 'KernelStack: 16256 kB' 'PageTables: 8816 kB' 
'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9248532 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210868 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 
-- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # [[ KReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.048 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.048 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read 
-r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:28.049 19:59:12 -- setup/common.sh@33 -- # echo 0 00:03:28.049 19:59:12 -- setup/common.sh@33 -- # return 0 00:03:28.049 19:59:12 -- setup/hugepages.sh@97 -- # anon=0 00:03:28.049 19:59:12 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:28.049 19:59:12 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:28.049 19:59:12 -- setup/common.sh@18 -- # local node= 00:03:28.049 19:59:12 -- setup/common.sh@19 -- # local var val 00:03:28.049 19:59:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.049 19:59:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.049 19:59:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.049 19:59:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.049 19:59:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.049 19:59:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75708484 kB' 'MemAvailable: 79364420 kB' 'Buffers: 19064 kB' 'Cached: 11478748 kB' 'SwapCached: 0 kB' 'Active: 8534680 kB' 'Inactive: 3531884 kB' 'Active(anon): 7869164 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572148 kB' 'Mapped: 202760 kB' 'Shmem: 7300412 kB' 'KReclaimable: 224316 kB' 'Slab: 603156 kB' 'SReclaimable: 224316 kB' 'SUnreclaim: 378840 kB' 'KernelStack: 16240 kB' 'PageTables: 8764 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9248544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210852 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 
'DirectMap1G: 89128960 kB' 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 
00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.049 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.049 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 
-- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:28.050 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.050 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.050 19:59:12 -- setup/common.sh@33 -- # echo 0 00:03:28.050 19:59:12 -- setup/common.sh@33 -- # return 0 00:03:28.050 19:59:12 -- setup/hugepages.sh@99 -- # surp=0 00:03:28.050 19:59:12 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:28.050 19:59:12 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:28.050 19:59:12 -- setup/common.sh@18 -- # local node= 00:03:28.050 19:59:12 -- setup/common.sh@19 -- # local var val 00:03:28.050 19:59:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.050 19:59:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.050 19:59:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.050 19:59:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.050 19:59:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.050 19:59:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.050 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75713732 kB' 'MemAvailable: 79369668 kB' 'Buffers: 19064 kB' 'Cached: 11478760 kB' 'SwapCached: 0 kB' 'Active: 8534948 kB' 'Inactive: 3531884 kB' 'Active(anon): 7869432 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572432 kB' 'Mapped: 202700 kB' 'Shmem: 7300424 kB' 'KReclaimable: 224316 kB' 'Slab: 603148 kB' 'SReclaimable: 224316 kB' 'SUnreclaim: 378832 kB' 'KernelStack: 16240 kB' 'PageTables: 8784 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9248560 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210820 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Buffers == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 
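The identical scan is then repeated with HugePages_Rsvd as the target, which is why the log shows the same run of "continue" steps over every other meminfo key a second time. Outside the test harness the same value can be pulled with a single awk call (an illustrative alternative, not what setup/common.sh itself does):

    awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo    # 0 in this run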
00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.051 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.051 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:28.052 19:59:12 -- setup/common.sh@33 -- # echo 0 00:03:28.052 19:59:12 -- setup/common.sh@33 -- # return 0 00:03:28.052 19:59:12 -- setup/hugepages.sh@100 -- # resv=0 00:03:28.052 19:59:12 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:28.052 nr_hugepages=1024 00:03:28.052 19:59:12 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:28.052 resv_hugepages=0 00:03:28.052 19:59:12 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:28.052 surplus_hugepages=0 00:03:28.052 19:59:12 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:28.052 anon_hugepages=0 00:03:28.052 19:59:12 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:28.052 19:59:12 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:28.052 19:59:12 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:28.052 19:59:12 -- setup/common.sh@17 -- # local 
get=HugePages_Total 00:03:28.052 19:59:12 -- setup/common.sh@18 -- # local node= 00:03:28.052 19:59:12 -- setup/common.sh@19 -- # local var val 00:03:28.052 19:59:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.052 19:59:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.052 19:59:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:28.052 19:59:12 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:28.052 19:59:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.052 19:59:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75714076 kB' 'MemAvailable: 79370012 kB' 'Buffers: 19064 kB' 'Cached: 11478772 kB' 'SwapCached: 0 kB' 'Active: 8534708 kB' 'Inactive: 3531884 kB' 'Active(anon): 7869192 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572156 kB' 'Mapped: 202760 kB' 'Shmem: 7300436 kB' 'KReclaimable: 224316 kB' 'Slab: 603148 kB' 'SReclaimable: 224316 kB' 'SUnreclaim: 378832 kB' 'KernelStack: 16240 kB' 'PageTables: 8760 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9248572 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210820 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 
19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.052 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.052 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var 
val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 
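By this stage the script has already derived anon=0, surp=0 and resv=0 and echoed the summary nr_hugepages=1024, resv_hugepages=0, surplus_hugepages=0, anon_hugepages=0; the scan now in progress fetches HugePages_Total so the harness can confirm the kernel really exposes the configured page count. The check traced here reduces to a small piece of arithmetic (sketch only, reusing the names from the echoed summary):

    nr_hugepages=1024 surp=0 resv=0
    total=$(awk '$1 == "HugePages_Total:" {print $2}' /proc/meminfo)
    # The trace shows this evaluating as (( 1024 == 1024 + 0 + 0 )).
    (( total == nr_hugepages + surp + resv )) || echo 'hugepage accounting mismatch' >&2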
00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 
19:59:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.053 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.053 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:28.053 19:59:12 -- setup/common.sh@33 -- # echo 1024 00:03:28.053 19:59:12 -- setup/common.sh@33 -- # return 0 00:03:28.053 19:59:12 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:28.053 19:59:12 -- setup/hugepages.sh@112 -- # get_nodes 00:03:28.053 19:59:12 -- setup/hugepages.sh@27 -- # local node 00:03:28.053 19:59:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:28.053 19:59:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:28.053 19:59:12 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:28.053 19:59:12 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:28.053 19:59:12 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:28.053 19:59:12 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:28.053 19:59:12 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:28.053 19:59:12 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:28.053 19:59:12 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:28.053 19:59:12 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:28.053 19:59:12 -- setup/common.sh@18 -- # local node=0 00:03:28.053 19:59:12 -- setup/common.sh@19 -- # local var val 00:03:28.053 19:59:12 -- setup/common.sh@20 -- # local mem_f mem 00:03:28.053 19:59:12 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:28.054 19:59:12 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:28.054 19:59:12 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:28.054 19:59:12 -- setup/common.sh@28 -- # mapfile -t mem 00:03:28.054 19:59:12 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 34832216 kB' 'MemUsed: 13284748 kB' 'SwapCached: 0 kB' 'Active: 
6178084 kB' 'Inactive: 3458068 kB' 'Active(anon): 5663256 kB' 'Inactive(anon): 0 kB' 'Active(file): 514828 kB' 'Inactive(file): 3458068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9345872 kB' 'Mapped: 70712 kB' 'AnonPages: 293440 kB' 'Shmem: 5372976 kB' 'KernelStack: 9560 kB' 'PageTables: 4088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149700 kB' 'Slab: 372732 kB' 'SReclaimable: 149700 kB' 'SUnreclaim: 223032 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- 
setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 
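From here the same field scan runs per NUMA node: with node=0 the helper reads /sys/devices/system/node/node0/meminfo instead of /proc/meminfo, and the mem=("${mem[@]#Node +([0-9]) }") step visible in the trace strips the "Node 0 " prefix those lines carry. A sketch of that per-node variant (illustrative names; extglob is required for the prefix-stripping pattern, as the trace's own expansion suggests):

    shopt -s extglob
    node_meminfo_value() {
        local node=$1 get=$2 line var val _
        while read -r line; do
            line=${line#Node +([0-9]) }               # drop the leading "Node 0 "
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done < "/sys/devices/system/node/node${node}/meminfo"
        return 1
    }

    node_meminfo_value 0 HugePages_Total    # 1024 on node0 in this run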
00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Free == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # continue 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # IFS=': ' 00:03:28.054 19:59:12 -- setup/common.sh@31 -- # read -r var val _ 00:03:28.054 19:59:12 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:28.054 19:59:12 -- setup/common.sh@33 -- # echo 0 00:03:28.054 19:59:12 -- setup/common.sh@33 -- # return 0 00:03:28.055 19:59:12 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:28.055 19:59:12 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:28.055 19:59:12 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:28.055 19:59:12 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:28.055 19:59:12 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:28.055 node0=1024 expecting 1024 00:03:28.055 19:59:12 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:28.055 00:03:28.055 real 0m8.612s 00:03:28.055 user 0m1.784s 00:03:28.055 sys 0m3.624s 00:03:28.055 19:59:12 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:28.055 19:59:12 -- common/autotest_common.sh@10 -- # set +x 00:03:28.055 ************************************ 00:03:28.055 END TEST default_setup 00:03:28.055 ************************************ 00:03:28.055 19:59:12 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:28.055 19:59:12 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:28.055 19:59:12 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:28.055 19:59:12 -- common/autotest_common.sh@10 -- # set +x 00:03:28.055 ************************************ 00:03:28.055 START TEST per_node_1G_alloc 00:03:28.055 ************************************ 00:03:28.055 19:59:12 -- common/autotest_common.sh@1121 -- # per_node_1G_alloc 00:03:28.055 19:59:12 -- setup/hugepages.sh@143 -- # local IFS=, 00:03:28.055 19:59:12 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:28.055 19:59:12 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:28.055 19:59:12 -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:28.055 19:59:12 -- setup/hugepages.sh@51 -- # shift 00:03:28.055 19:59:12 -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:28.055 19:59:12 -- setup/hugepages.sh@52 -- # local node_ids 00:03:28.055 19:59:12 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:28.055 19:59:12 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:28.055 19:59:12 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:28.055 19:59:12 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:28.055 19:59:12 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:28.055 19:59:12 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:28.055 19:59:12 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:28.055 19:59:12 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:28.055 19:59:12 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:28.055 19:59:12 -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:28.055 19:59:12 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:28.055 19:59:12 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:28.055 19:59:12 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:28.055 19:59:12 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:28.055 19:59:12 -- setup/hugepages.sh@73 -- # return 0 00:03:28.055 19:59:12 -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:28.055 
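The per_node_1G_alloc test starting here calls get_test_nr_hugepages with 1048576 kB and node IDs 0 and 1: at the default 2048 kB hugepage size that is 512 pages, and the trace shows nodes_test being set to 512 for each of the two nodes before NRHUGE=512 and HUGENODE=0,1 are handed to setup.sh. The underlying arithmetic, as a sketch (variable names are illustrative):

    size_kb=1048576                                   # 1 GiB, expressed in kB
    hugepagesize_kb=$(awk '$1 == "Hugepagesize:" {print $2}' /proc/meminfo)   # 2048 here
    nr_per_node=$(( size_kb / hugepagesize_kb ))      # 1048576 / 2048 = 512
    echo "NRHUGE=$nr_per_node HUGENODE=0,1"           # what the trace exports next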
19:59:12 -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:28.055 19:59:12 -- setup/hugepages.sh@146 -- # setup output 00:03:28.055 19:59:12 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:28.055 19:59:12 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:31.340 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:31.340 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:31.340 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:31.340 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:31.340 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:31.340 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:31.340 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:31.340 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:31.340 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:31.341 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:31.341 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:31.341 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:31.341 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:31.341 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:31.341 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:31.341 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:31.341 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:33.249 19:59:17 -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:33.249 19:59:17 -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:33.249 19:59:17 -- setup/hugepages.sh@89 -- # local node 00:03:33.249 19:59:17 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:33.249 19:59:17 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:33.249 19:59:17 -- setup/hugepages.sh@92 -- # local surp 00:03:33.249 19:59:17 -- setup/hugepages.sh@93 -- # local resv 00:03:33.249 19:59:17 -- setup/hugepages.sh@94 -- # local anon 00:03:33.249 19:59:17 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:33.249 19:59:17 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:33.249 19:59:17 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:33.249 19:59:17 -- setup/common.sh@18 -- # local node= 00:03:33.249 19:59:17 -- setup/common.sh@19 -- # local var val 00:03:33.249 19:59:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.249 19:59:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.249 19:59:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.249 19:59:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.249 19:59:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.249 19:59:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.249 19:59:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75712784 kB' 'MemAvailable: 79368720 kB' 'Buffers: 19064 kB' 'Cached: 11478884 kB' 'SwapCached: 0 kB' 'Active: 8533692 kB' 'Inactive: 3531884 kB' 'Active(anon): 7868176 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570932 kB' 'Mapped: 201832 
kB' 'Shmem: 7300548 kB' 'KReclaimable: 224316 kB' 'Slab: 602488 kB' 'SReclaimable: 224316 kB' 'SUnreclaim: 378172 kB' 'KernelStack: 16240 kB' 'PageTables: 8628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9238460 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210884 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.249 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.249 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- 
# read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- 
setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.250 19:59:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:33.250 19:59:17 -- setup/common.sh@33 -- # echo 0 00:03:33.250 19:59:17 -- setup/common.sh@33 -- # return 0 00:03:33.250 19:59:17 -- setup/hugepages.sh@97 -- # anon=0 00:03:33.250 19:59:17 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:33.250 19:59:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:33.250 19:59:17 -- setup/common.sh@18 -- # local node= 00:03:33.250 19:59:17 -- setup/common.sh@19 -- # local var val 00:03:33.250 19:59:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.250 19:59:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.250 19:59:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.250 19:59:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.250 19:59:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.250 19:59:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.250 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75713576 kB' 'MemAvailable: 79369512 kB' 'Buffers: 19064 kB' 'Cached: 11478888 kB' 'SwapCached: 0 kB' 'Active: 8533376 kB' 'Inactive: 3531884 kB' 'Active(anon): 7867860 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570640 kB' 'Mapped: 201804 kB' 'Shmem: 7300552 kB' 'KReclaimable: 224316 kB' 'Slab: 602520 kB' 'SReclaimable: 224316 kB' 'SUnreclaim: 378204 kB' 'KernelStack: 16240 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9238472 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210868 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 
1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 
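The long runs of continue above and below are the xtrace of a field-by-field scan over a saved /proc/meminfo snapshot: each entry is split on ': ', the field name is compared against the key verify_nr_hugepages asked for (AnonHugePages, HugePages_Surp, HugePages_Rsvd, HugePages_Total in turn), and everything else is skipped until the key matches and its value is echoed. A condensed stand-in for that loop, simplified from what setup/common.sh traces here (the real get_meminfo can also read a per-node meminfo under /sys/devices/system/node):

#!/usr/bin/env bash
# Simplified stand-in for the loop whose trace surrounds this point.
get_meminfo() {
    local get=$1 line var val _
    mapfile -t mem < /proc/meminfo           # snapshot, as in the trace
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue     # source of the repeated 'continue' entries
        echo "$val"                          # e.g. 0 for HugePages_Surp
        return 0
    done
    return 1
}

get_meminfo HugePages_Surp                   # -> 0 on this test node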
19:59:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 
19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 
19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.251 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.251 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read 
-r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.252 19:59:17 -- setup/common.sh@33 -- # echo 0 00:03:33.252 19:59:17 -- setup/common.sh@33 -- # return 0 00:03:33.252 19:59:17 -- setup/hugepages.sh@99 -- # surp=0 00:03:33.252 19:59:17 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:33.252 19:59:17 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:33.252 19:59:17 -- setup/common.sh@18 -- # local node= 00:03:33.252 19:59:17 -- setup/common.sh@19 -- # local var val 00:03:33.252 19:59:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.252 19:59:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.252 19:59:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.252 19:59:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.252 19:59:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.252 19:59:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75714004 kB' 'MemAvailable: 79369940 kB' 'Buffers: 19064 kB' 'Cached: 11478900 kB' 'SwapCached: 0 kB' 'Active: 8533408 kB' 'Inactive: 3531884 kB' 'Active(anon): 7867892 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570640 kB' 'Mapped: 201804 kB' 'Shmem: 7300564 kB' 'KReclaimable: 224316 kB' 'Slab: 602512 kB' 'SReclaimable: 224316 kB' 'SUnreclaim: 378196 kB' 'KernelStack: 16240 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9238484 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210852 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # 
continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.252 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.252 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 
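As a quick cross-check of the snapshots printed above: each dump reports HugePages_Total: 1024 and Hugepagesize: 2048 kB, and the Hugetlb field is exactly their product, so the full pool requested for this test is accounted for. The arithmetic, with the values copied from the snapshot:

#!/usr/bin/env bash
# Values as printed in the meminfo snapshots above.
hugepages_total=1024
hugepagesize_kb=2048
echo $(( hugepages_total * hugepagesize_kb ))   # 2097152, matching 'Hugetlb: 2097152 kB'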
19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:33.253 19:59:17 -- setup/common.sh@33 -- # echo 0 00:03:33.253 19:59:17 -- setup/common.sh@33 -- # return 0 00:03:33.253 19:59:17 -- setup/hugepages.sh@100 -- # resv=0 00:03:33.253 19:59:17 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:33.253 nr_hugepages=1024 00:03:33.253 19:59:17 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:33.253 resv_hugepages=0 00:03:33.253 19:59:17 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:33.253 surplus_hugepages=0 00:03:33.253 19:59:17 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:33.253 anon_hugepages=0 00:03:33.253 19:59:17 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:33.253 19:59:17 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:33.253 19:59:17 -- setup/hugepages.sh@110 -- # get_meminfo 
HugePages_Total 00:03:33.253 19:59:17 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:33.253 19:59:17 -- setup/common.sh@18 -- # local node= 00:03:33.253 19:59:17 -- setup/common.sh@19 -- # local var val 00:03:33.253 19:59:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.253 19:59:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.253 19:59:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:33.253 19:59:17 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:33.253 19:59:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.253 19:59:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75714692 kB' 'MemAvailable: 79370628 kB' 'Buffers: 19064 kB' 'Cached: 11478916 kB' 'SwapCached: 0 kB' 'Active: 8533420 kB' 'Inactive: 3531884 kB' 'Active(anon): 7867904 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570640 kB' 'Mapped: 201804 kB' 'Shmem: 7300580 kB' 'KReclaimable: 224316 kB' 'Slab: 602512 kB' 'SReclaimable: 224316 kB' 'SUnreclaim: 378196 kB' 'KernelStack: 16240 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9238500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210852 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.253 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.253 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- 
setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.254 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.254 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.255 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.255 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:33.255 19:59:17 -- setup/common.sh@33 -- # echo 1024 00:03:33.255 19:59:17 -- setup/common.sh@33 -- # return 0 00:03:33.255 19:59:17 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:33.255 19:59:17 -- setup/hugepages.sh@112 -- # get_nodes 00:03:33.255 19:59:17 -- setup/hugepages.sh@27 -- # local node 00:03:33.255 19:59:17 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:33.255 19:59:17 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:33.255 19:59:17 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:33.255 19:59:17 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:33.255 19:59:17 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:33.516 19:59:17 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:33.516 19:59:17 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:33.516 19:59:17 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:33.516 19:59:17 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:33.516 19:59:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:33.516 19:59:17 -- setup/common.sh@18 -- # local node=0 00:03:33.516 19:59:17 -- setup/common.sh@19 -- # local var val 00:03:33.516 19:59:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.516 19:59:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.516 19:59:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:33.516 19:59:17 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:33.516 19:59:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.516 19:59:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@16 -- # printf '%s\n' 
'MemTotal: 48116964 kB' 'MemFree: 35900828 kB' 'MemUsed: 12216136 kB' 'SwapCached: 0 kB' 'Active: 6176764 kB' 'Inactive: 3458068 kB' 'Active(anon): 5661936 kB' 'Inactive(anon): 0 kB' 'Active(file): 514828 kB' 'Inactive(file): 3458068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9345912 kB' 'Mapped: 69680 kB' 'AnonPages: 292032 kB' 'Shmem: 5373016 kB' 'KernelStack: 9560 kB' 'PageTables: 4044 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149700 kB' 'Slab: 372324 kB' 'SReclaimable: 149700 kB' 'SUnreclaim: 222624 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 
00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- 
setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.516 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.516 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 
00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@33 -- # echo 0 00:03:33.517 19:59:17 -- setup/common.sh@33 -- # return 0 00:03:33.517 19:59:17 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:33.517 19:59:17 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:33.517 19:59:17 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:33.517 19:59:17 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:33.517 19:59:17 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:33.517 19:59:17 -- setup/common.sh@18 -- # local node=1 00:03:33.517 19:59:17 -- setup/common.sh@19 -- # local var val 00:03:33.517 19:59:17 -- setup/common.sh@20 -- # local mem_f mem 00:03:33.517 19:59:17 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:33.517 19:59:17 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:33.517 19:59:17 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:33.517 19:59:17 -- setup/common.sh@28 -- # mapfile -t mem 00:03:33.517 19:59:17 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176544 kB' 'MemFree: 39814368 kB' 'MemUsed: 4362176 kB' 'SwapCached: 0 kB' 'Active: 2356996 kB' 'Inactive: 73816 kB' 'Active(anon): 2206308 kB' 'Inactive(anon): 0 kB' 'Active(file): 150688 kB' 'Inactive(file): 73816 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2152096 kB' 'Mapped: 132124 kB' 'AnonPages: 278964 kB' 'Shmem: 1927592 kB' 'KernelStack: 6696 kB' 'PageTables: 4660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 74616 kB' 'Slab: 230180 kB' 'SReclaimable: 74616 kB' 'SUnreclaim: 155564 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 
19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- 
setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.517 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.517 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # continue 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # IFS=': ' 00:03:33.518 19:59:17 -- setup/common.sh@31 -- # read -r var val _ 00:03:33.518 19:59:17 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:33.518 19:59:17 -- setup/common.sh@33 -- # echo 0 00:03:33.518 19:59:17 -- setup/common.sh@33 -- # return 0 00:03:33.518 19:59:17 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:33.518 19:59:17 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:33.518 19:59:17 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:33.518 19:59:17 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:33.518 19:59:17 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:33.518 node0=512 expecting 512 00:03:33.518 19:59:17 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:33.518 19:59:17 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:33.518 19:59:17 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:33.518 19:59:17 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:33.518 node1=512 expecting 512 00:03:33.518 19:59:17 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:33.518 00:03:33.518 real 0m5.256s 00:03:33.518 user 0m1.617s 00:03:33.518 sys 0m3.538s 00:03:33.518 19:59:17 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:33.518 19:59:17 -- common/autotest_common.sh@10 -- # set +x 00:03:33.518 ************************************ 00:03:33.518 END TEST per_node_1G_alloc 00:03:33.518 ************************************ 00:03:33.518 19:59:17 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:33.518 
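[Note on the trace above] The per_node_1G_alloc verification passes because get_meminfo reports 'HugePages_Total: 512' and 'HugePages_Surp: 0' for both node0 and node1, so the assertion at setup/hugepages.sh@110, (( 1024 == nr_hugepages + surp + resv )), holds with the 1024 pages split evenly across the two NUMA nodes ("node0=512 expecting 512", "node1=512 expecting 512"). The following is a minimal bash sketch of what the traced get_meminfo helper appears to do, reconstructed only from the setup/common.sh@17-@33 lines in this log; the actual helper in the SPDK tree may differ in detail, and the loop structure here is an approximation of the repeated IFS=': ' / read -r var val _ steps seen in the trace.

    shopt -s extglob                  # needed for the +([0-9]) pattern used below, as in the trace
    get_meminfo() {
        local get=$1 node=$2          # field name and optional NUMA node (trace: local get=..., local node=...)
        local var val _
        local mem_f mem
        mem_f=/proc/meminfo
        # When a node is given and a node-local meminfo exists, query that file instead
        if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
            mem_f=/sys/devices/system/node/node$node/meminfo
        fi
        mapfile -t mem < "$mem_f"
        # Per-node meminfo prefixes every line with "Node N "; strip that prefix (trace: @29)
        mem=("${mem[@]#Node +([0-9]) }")
        local line
        for line in "${mem[@]}"; do
            # Split "Field:   value [kB]" into name and value (trace: @31/@32)
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] || continue
            echo "$val"               # trace: @33 echoes the value and returns 0
            return 0
        done
        return 1
    }

With this sketch, the node-level checks in the trace reduce to calls such as get_meminfo HugePages_Surp 0 and get_meminfo HugePages_Surp 1, both returning 0 on this box, while the per-node counts of 512 come from the nodes_sys array populated in get_nodes.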
19:59:17 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:33.518 19:59:17 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:33.518 19:59:17 -- common/autotest_common.sh@10 -- # set +x 00:03:33.518 ************************************ 00:03:33.518 START TEST even_2G_alloc 00:03:33.518 ************************************ 00:03:33.518 19:59:17 -- common/autotest_common.sh@1121 -- # even_2G_alloc 00:03:33.518 19:59:17 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:33.518 19:59:17 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:33.518 19:59:17 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:33.518 19:59:17 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:33.518 19:59:17 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:33.518 19:59:17 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:33.518 19:59:17 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:33.518 19:59:17 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:33.518 19:59:17 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:33.518 19:59:17 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:33.518 19:59:17 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:33.518 19:59:17 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:33.518 19:59:17 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:33.518 19:59:17 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:33.518 19:59:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:33.518 19:59:17 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:33.518 19:59:17 -- setup/hugepages.sh@83 -- # : 512 00:03:33.518 19:59:17 -- setup/hugepages.sh@84 -- # : 1 00:03:33.518 19:59:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:33.518 19:59:17 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:33.518 19:59:17 -- setup/hugepages.sh@83 -- # : 0 00:03:33.518 19:59:17 -- setup/hugepages.sh@84 -- # : 0 00:03:33.518 19:59:17 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:33.518 19:59:17 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:33.518 19:59:17 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:33.518 19:59:17 -- setup/hugepages.sh@153 -- # setup output 00:03:33.518 19:59:17 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:33.518 19:59:17 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:37.713 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:37.714 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:37.714 0000:80:04.1 (8086 2021): 
Already using the vfio-pci driver 00:03:37.714 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:39.621 19:59:23 -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:39.621 19:59:23 -- setup/hugepages.sh@89 -- # local node 00:03:39.621 19:59:23 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:39.621 19:59:23 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:39.621 19:59:23 -- setup/hugepages.sh@92 -- # local surp 00:03:39.621 19:59:23 -- setup/hugepages.sh@93 -- # local resv 00:03:39.621 19:59:23 -- setup/hugepages.sh@94 -- # local anon 00:03:39.621 19:59:23 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:39.621 19:59:23 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:39.621 19:59:23 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:39.621 19:59:23 -- setup/common.sh@18 -- # local node= 00:03:39.621 19:59:23 -- setup/common.sh@19 -- # local var val 00:03:39.621 19:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.621 19:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.621 19:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.621 19:59:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.621 19:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.621 19:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75704936 kB' 'MemAvailable: 79360856 kB' 'Buffers: 19064 kB' 'Cached: 11479044 kB' 'SwapCached: 0 kB' 'Active: 8535640 kB' 'Inactive: 3531884 kB' 'Active(anon): 7870124 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572712 kB' 'Mapped: 201920 kB' 'Shmem: 7300708 kB' 'KReclaimable: 224284 kB' 'Slab: 602848 kB' 'SReclaimable: 224284 kB' 'SUnreclaim: 378564 kB' 'KernelStack: 16416 kB' 'PageTables: 8596 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9241980 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210964 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 
00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ SwapFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.621 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.621 19:59:23 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 
19:59:23 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.622 19:59:23 -- setup/common.sh@33 -- # echo 0 00:03:39.622 19:59:23 -- setup/common.sh@33 -- # 
return 0 00:03:39.622 19:59:23 -- setup/hugepages.sh@97 -- # anon=0 00:03:39.622 19:59:23 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:39.622 19:59:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.622 19:59:23 -- setup/common.sh@18 -- # local node= 00:03:39.622 19:59:23 -- setup/common.sh@19 -- # local var val 00:03:39.622 19:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.622 19:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.622 19:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.622 19:59:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.622 19:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.622 19:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75704800 kB' 'MemAvailable: 79360700 kB' 'Buffers: 19064 kB' 'Cached: 11479044 kB' 'SwapCached: 0 kB' 'Active: 8535360 kB' 'Inactive: 3531884 kB' 'Active(anon): 7869844 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572464 kB' 'Mapped: 201892 kB' 'Shmem: 7300708 kB' 'KReclaimable: 224244 kB' 'Slab: 602792 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 378548 kB' 'KernelStack: 16160 kB' 'PageTables: 8332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9240596 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210916 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.622 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.622 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 
19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- 
setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.623 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.623 19:59:23 -- setup/common.sh@33 -- # echo 0 00:03:39.623 19:59:23 -- setup/common.sh@33 -- # return 0 00:03:39.623 19:59:23 -- setup/hugepages.sh@99 -- # surp=0 00:03:39.623 19:59:23 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:39.623 19:59:23 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:39.623 19:59:23 -- setup/common.sh@18 -- # local node= 00:03:39.623 19:59:23 -- setup/common.sh@19 -- # local var val 00:03:39.623 19:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.623 19:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.623 19:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.623 19:59:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.623 19:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.623 19:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.623 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 
92293508 kB' 'MemFree: 75705168 kB' 'MemAvailable: 79361068 kB' 'Buffers: 19064 kB' 'Cached: 11479056 kB' 'SwapCached: 0 kB' 'Active: 8535996 kB' 'Inactive: 3531884 kB' 'Active(anon): 7870480 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 573080 kB' 'Mapped: 201896 kB' 'Shmem: 7300720 kB' 'KReclaimable: 224244 kB' 'Slab: 602856 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 378612 kB' 'KernelStack: 16448 kB' 'PageTables: 9248 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9242008 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 211028 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Active(anon) 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # 
continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.624 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.624 19:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 
-- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.625 19:59:23 -- setup/common.sh@33 -- # echo 0 00:03:39.625 19:59:23 -- setup/common.sh@33 -- # return 0 00:03:39.625 19:59:23 -- setup/hugepages.sh@100 -- # resv=0 00:03:39.625 19:59:23 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:39.625 nr_hugepages=1024 00:03:39.625 19:59:23 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:39.625 resv_hugepages=0 00:03:39.625 19:59:23 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:39.625 surplus_hugepages=0 00:03:39.625 19:59:23 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:39.625 anon_hugepages=0 00:03:39.625 19:59:23 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.625 19:59:23 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:39.625 19:59:23 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:39.625 19:59:23 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:39.625 19:59:23 -- setup/common.sh@18 -- # local node= 00:03:39.625 19:59:23 -- setup/common.sh@19 -- # local var val 00:03:39.625 19:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.625 19:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.625 19:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.625 19:59:23 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.625 19:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.625 19:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75703584 kB' 'MemAvailable: 79359484 kB' 'Buffers: 19064 kB' 'Cached: 11479056 kB' 'SwapCached: 0 kB' 'Active: 8536036 kB' 'Inactive: 3531884 kB' 'Active(anon): 7870520 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 573120 kB' 'Mapped: 201896 kB' 'Shmem: 7300720 kB' 'KReclaimable: 224244 kB' 'Slab: 602856 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 378612 kB' 'KernelStack: 16368 kB' 'PageTables: 9128 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 
53486780 kB' 'Committed_AS: 9242020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210980 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- 
setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.625 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.625 19:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- 
setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 
19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.626 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.626 19:59:23 -- 
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.626 19:59:23 -- setup/common.sh@33 -- # echo 1024 00:03:39.626 19:59:23 -- setup/common.sh@33 -- # return 0 00:03:39.626 19:59:23 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.626 19:59:23 -- setup/hugepages.sh@112 -- # get_nodes 00:03:39.626 19:59:23 -- setup/hugepages.sh@27 -- # local node 00:03:39.626 19:59:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.626 19:59:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:39.626 19:59:23 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.626 19:59:23 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:39.626 19:59:23 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:39.626 19:59:23 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:39.626 19:59:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:39.626 19:59:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:39.626 19:59:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:39.626 19:59:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.626 19:59:23 -- setup/common.sh@18 -- # local node=0 00:03:39.626 19:59:23 -- setup/common.sh@19 -- # local var val 00:03:39.626 19:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.626 19:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.626 19:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:39.627 19:59:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:39.627 19:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.627 19:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 35903772 kB' 'MemUsed: 12213192 kB' 'SwapCached: 0 kB' 'Active: 6179368 kB' 'Inactive: 3458068 kB' 'Active(anon): 5664540 kB' 'Inactive(anon): 0 kB' 'Active(file): 514828 kB' 'Inactive(file): 3458068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9345944 kB' 'Mapped: 69688 kB' 'AnonPages: 294672 kB' 'Shmem: 5373048 kB' 'KernelStack: 9608 kB' 'PageTables: 4184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149668 kB' 'Slab: 372460 kB' 'SReclaimable: 149668 kB' 'SUnreclaim: 222792 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 
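For reference, the lookup that the xtrace above keeps repeating — open /proc/meminfo (or the per-node copy under /sys/devices/system/node/nodeN/meminfo), strip the "Node N " prefix, split each line on ': ', and print the value once the requested key matches — can be sketched roughly as below. This is a minimal reconstruction from the trace, not the verbatim setup/common.sh helper being traced here; the function name get_meminfo_sketch and its arguments are illustrative only.

# Minimal sketch reconstructed from the xtrace above -- illustrative, not the
# verbatim SPDK helper. Prints the value of a meminfo key, optionally for one
# NUMA node.
get_meminfo_sketch() {
    local get=$1 node=${2:-}            # key to look up, optional NUMA node id
    local mem_f=/proc/meminfo
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local line var val _
    while IFS= read -r line; do
        line=${line#Node "$node" }      # per-node files prefix lines with "Node <n> "
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$get" ]]; then
            echo "$val"                 # e.g. 1024 for HugePages_Total in the trace
            return 0
        fi
    done < "$mem_f"
    return 1
}

Against the values traced here, HugePages_Total is 1024 with 0 reserved and 0 surplus system-wide, and the per-node reads (node0 above, node1 just below) each report 512 huge pages, so the per-node split accounts for the full 1024 that hugepages.sh checks for.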
00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Mapped 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.627 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.627 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.627 19:59:23 -- setup/common.sh@33 -- # echo 0 00:03:39.627 19:59:23 -- setup/common.sh@33 -- # return 0 00:03:39.628 19:59:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:39.628 19:59:23 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:39.628 19:59:23 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:39.628 19:59:23 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:39.628 19:59:23 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.628 19:59:23 -- setup/common.sh@18 -- # local node=1 00:03:39.628 19:59:23 -- setup/common.sh@19 -- # local var val 00:03:39.628 19:59:23 -- setup/common.sh@20 -- # local mem_f mem 00:03:39.628 19:59:23 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.628 19:59:23 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:39.628 19:59:23 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:39.628 19:59:23 -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.628 19:59:23 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.628 
19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176544 kB' 'MemFree: 39800456 kB' 'MemUsed: 4376088 kB' 'SwapCached: 0 kB' 'Active: 2356036 kB' 'Inactive: 73816 kB' 'Active(anon): 2205348 kB' 'Inactive(anon): 0 kB' 'Active(file): 150688 kB' 'Inactive(file): 73816 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2152220 kB' 'Mapped: 132192 kB' 'AnonPages: 277828 kB' 'Shmem: 1927716 kB' 'KernelStack: 6648 kB' 'PageTables: 4500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 74576 kB' 'Slab: 230428 kB' 'SReclaimable: 74576 kB' 'SUnreclaim: 155852 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.628 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.628 19:59:23 -- setup/common.sh@32 -- # [[ 
[xtrace condensed: the same IFS=': ' / read -r var val _ / continue cycle repeats for each /proc/meminfo field from Inactive(file) through HugePages_Total while scanning for HugePages_Surp; a reconstructed sketch of this loop follows the excerpt]
setup/common.sh@32 -- # continue 00:03:39.629 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.629 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.629 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.629 19:59:23 -- setup/common.sh@32 -- # continue 00:03:39.629 19:59:23 -- setup/common.sh@31 -- # IFS=': ' 00:03:39.629 19:59:23 -- setup/common.sh@31 -- # read -r var val _ 00:03:39.629 19:59:23 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.629 19:59:23 -- setup/common.sh@33 -- # echo 0 00:03:39.629 19:59:23 -- setup/common.sh@33 -- # return 0 00:03:39.629 19:59:23 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:39.629 19:59:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:39.629 19:59:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:39.629 19:59:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:39.629 19:59:23 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:39.629 node0=512 expecting 512 00:03:39.629 19:59:23 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:39.629 19:59:23 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:39.629 19:59:23 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:39.629 19:59:23 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:39.629 node1=512 expecting 512 00:03:39.629 19:59:23 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:39.629 00:03:39.629 real 0m5.944s 00:03:39.629 user 0m2.090s 00:03:39.629 sys 0m3.915s 00:03:39.629 19:59:23 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:39.629 19:59:23 -- common/autotest_common.sh@10 -- # set +x 00:03:39.629 ************************************ 00:03:39.629 END TEST even_2G_alloc 00:03:39.629 ************************************ 00:03:39.629 19:59:23 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:39.629 19:59:23 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:39.629 19:59:23 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:39.629 19:59:23 -- common/autotest_common.sh@10 -- # set +x 00:03:39.888 ************************************ 00:03:39.888 START TEST odd_alloc 00:03:39.888 ************************************ 00:03:39.888 19:59:24 -- common/autotest_common.sh@1121 -- # odd_alloc 00:03:39.888 19:59:24 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:39.888 19:59:24 -- setup/hugepages.sh@49 -- # local size=2098176 00:03:39.888 19:59:24 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:39.888 19:59:24 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:39.888 19:59:24 -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:39.888 19:59:24 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:39.888 19:59:24 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:39.888 19:59:24 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:39.888 19:59:24 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:39.888 19:59:24 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:39.888 19:59:24 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:39.888 19:59:24 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:39.888 19:59:24 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:39.888 19:59:24 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:39.888 19:59:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:39.888 19:59:24 -- setup/hugepages.sh@82 -- # 
nodes_test[_no_nodes - 1]=512 00:03:39.888 19:59:24 -- setup/hugepages.sh@83 -- # : 513 00:03:39.888 19:59:24 -- setup/hugepages.sh@84 -- # : 1 00:03:39.888 19:59:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:39.888 19:59:24 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:39.888 19:59:24 -- setup/hugepages.sh@83 -- # : 0 00:03:39.888 19:59:24 -- setup/hugepages.sh@84 -- # : 0 00:03:39.888 19:59:24 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:39.888 19:59:24 -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:39.888 19:59:24 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:39.888 19:59:24 -- setup/hugepages.sh@160 -- # setup output 00:03:39.888 19:59:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:39.888 19:59:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:43.177 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:43.177 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:43.177 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:43.436 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:43.436 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:43.436 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:43.436 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:45.352 19:59:29 -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:45.352 19:59:29 -- setup/hugepages.sh@89 -- # local node 00:03:45.352 19:59:29 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:45.352 19:59:29 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:45.352 19:59:29 -- setup/hugepages.sh@92 -- # local surp 00:03:45.352 19:59:29 -- setup/hugepages.sh@93 -- # local resv 00:03:45.352 19:59:29 -- setup/hugepages.sh@94 -- # local anon 00:03:45.352 19:59:29 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:45.352 19:59:29 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:45.352 19:59:29 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:45.352 19:59:29 -- setup/common.sh@18 -- # local node= 00:03:45.352 19:59:29 -- setup/common.sh@19 -- # local var val 00:03:45.352 19:59:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.352 19:59:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.352 19:59:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.352 19:59:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.352 19:59:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.352 19:59:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- 
setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75702128 kB' 'MemAvailable: 79358028 kB' 'Buffers: 19064 kB' 'Cached: 11479204 kB' 'SwapCached: 0 kB' 'Active: 8537008 kB' 'Inactive: 3531884 kB' 'Active(anon): 7871492 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 573544 kB' 'Mapped: 202072 kB' 'Shmem: 7300868 kB' 'KReclaimable: 224244 kB' 'Slab: 602488 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 378244 kB' 'KernelStack: 16272 kB' 'PageTables: 8772 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 9240236 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210820 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.352 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.352 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.352 19:59:29 -- 
[xtrace condensed: per-field read/continue cycle from Active(anon) through CommitLimit while scanning for AnonHugePages]
19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:45.353 19:59:29 -- setup/common.sh@33 -- # echo 0 00:03:45.353 19:59:29 -- setup/common.sh@33 -- # return 0 00:03:45.353 19:59:29 -- setup/hugepages.sh@97 -- # anon=0 00:03:45.353 19:59:29 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:45.353 19:59:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.353 19:59:29 -- setup/common.sh@18 -- # local node= 00:03:45.353 19:59:29 -- setup/common.sh@19 -- # local var val 00:03:45.353 19:59:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.353 19:59:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.353 19:59:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.353 19:59:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.353 19:59:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.353 19:59:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75702908 kB' 'MemAvailable: 79358808 kB' 'Buffers: 19064 kB' 'Cached: 11479208 kB' 'SwapCached: 0 kB' 'Active: 8535900 kB' 'Inactive: 3531884 kB' 'Active(anon): 7870384 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572856 kB' 'Mapped: 201980 kB' 'Shmem: 7300872 kB' 'KReclaimable: 224244 kB' 'Slab: 602468 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 
378224 kB' 'KernelStack: 16240 kB' 'PageTables: 8660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 9240248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210772 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.353 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.353 19:59:29 -- setup/common.sh@32 -- # [[ Active(file) 
[xtrace condensed: per-field read/continue cycle from Active(file) through Unaccepted while scanning for HugePages_Surp]
setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.354 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.354 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.354 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.354 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.354 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.354 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.354 19:59:29 -- setup/common.sh@33 -- # echo 0 00:03:45.354 19:59:29 -- setup/common.sh@33 -- # return 0 00:03:45.354 19:59:29 -- setup/hugepages.sh@99 -- # surp=0 00:03:45.354 19:59:29 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:45.354 19:59:29 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:45.354 19:59:29 -- setup/common.sh@18 -- # local node= 00:03:45.354 19:59:29 -- setup/common.sh@19 -- # local var val 00:03:45.354 19:59:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.354 19:59:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.354 19:59:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.354 19:59:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.354 19:59:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.354 19:59:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.354 19:59:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75702908 kB' 'MemAvailable: 79358808 kB' 'Buffers: 19064 kB' 'Cached: 11479220 kB' 'SwapCached: 0 kB' 'Active: 8535968 kB' 'Inactive: 3531884 kB' 'Active(anon): 7870452 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572856 kB' 'Mapped: 201980 kB' 'Shmem: 7300884 kB' 'KReclaimable: 224244 kB' 'Slab: 602468 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 378224 kB' 'KernelStack: 16240 kB' 'PageTables: 8660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 9240264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210788 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:45.354 19:59:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.354 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.354 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.355 19:59:29 -- setup/common.sh@31 -- # read -r var val 
[xtrace condensed: per-field read/continue cycle from MemFree through Percpu while scanning for HugePages_Rsvd]
00:03:45.355 19:59:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.355 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.355 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.355 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.355 19:59:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.355 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.355 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.355 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.355 19:59:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.355 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.355 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.355 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:45.356 19:59:29 -- setup/common.sh@33 -- # echo 0 00:03:45.356 19:59:29 -- setup/common.sh@33 -- # return 0 00:03:45.356 19:59:29 -- setup/hugepages.sh@100 -- # resv=0 00:03:45.356 19:59:29 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:45.356 nr_hugepages=1025 00:03:45.356 19:59:29 -- setup/hugepages.sh@103 -- # 
echo resv_hugepages=0 00:03:45.356 resv_hugepages=0 00:03:45.356 19:59:29 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:45.356 surplus_hugepages=0 00:03:45.356 19:59:29 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:45.356 anon_hugepages=0 00:03:45.356 19:59:29 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:45.356 19:59:29 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:45.356 19:59:29 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:45.356 19:59:29 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:45.356 19:59:29 -- setup/common.sh@18 -- # local node= 00:03:45.356 19:59:29 -- setup/common.sh@19 -- # local var val 00:03:45.356 19:59:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.356 19:59:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.356 19:59:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.356 19:59:29 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.356 19:59:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.356 19:59:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.356 19:59:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75702908 kB' 'MemAvailable: 79358808 kB' 'Buffers: 19064 kB' 'Cached: 11479244 kB' 'SwapCached: 0 kB' 'Active: 8535920 kB' 'Inactive: 3531884 kB' 'Active(anon): 7870404 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 572780 kB' 'Mapped: 201980 kB' 'Shmem: 7300908 kB' 'KReclaimable: 224244 kB' 'Slab: 602468 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 378224 kB' 'KernelStack: 16224 kB' 'PageTables: 8604 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 9240276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210804 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l 
]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.356 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.356 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- 
setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.357 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.357 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:45.357 19:59:29 -- setup/common.sh@33 -- # echo 1025 00:03:45.357 19:59:29 -- setup/common.sh@33 -- # return 0 00:03:45.357 19:59:29 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:45.357 19:59:29 -- setup/hugepages.sh@112 -- # get_nodes 00:03:45.357 19:59:29 -- setup/hugepages.sh@27 -- # local node 00:03:45.357 19:59:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:45.357 19:59:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:45.357 19:59:29 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:45.357 19:59:29 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:45.357 19:59:29 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:45.357 19:59:29 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:45.357 19:59:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:45.357 19:59:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:45.357 19:59:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:45.357 19:59:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.357 19:59:29 -- setup/common.sh@18 -- # local node=0 00:03:45.357 19:59:29 -- setup/common.sh@19 -- # local var val 00:03:45.357 19:59:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.357 19:59:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.357 
19:59:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:45.357 19:59:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:45.357 19:59:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.358 19:59:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 35901788 kB' 'MemUsed: 12215176 kB' 'SwapCached: 0 kB' 'Active: 6178672 kB' 'Inactive: 3458068 kB' 'Active(anon): 5663844 kB' 'Inactive(anon): 0 kB' 'Active(file): 514828 kB' 'Inactive(file): 3458068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9345984 kB' 'Mapped: 69692 kB' 'AnonPages: 293848 kB' 'Shmem: 5373088 kB' 'KernelStack: 9560 kB' 'PageTables: 4068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149668 kB' 'Slab: 372356 kB' 'SReclaimable: 149668 kB' 'SUnreclaim: 222688 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var 
val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ Unaccepted == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.358 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.358 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.358 19:59:29 -- setup/common.sh@33 -- # echo 0 00:03:45.358 19:59:29 -- setup/common.sh@33 -- # return 0 00:03:45.358 19:59:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:45.358 19:59:29 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:45.358 19:59:29 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:45.358 19:59:29 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:45.359 19:59:29 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:45.359 19:59:29 -- setup/common.sh@18 -- # local node=1 00:03:45.359 19:59:29 -- setup/common.sh@19 -- # local var val 00:03:45.359 19:59:29 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.359 19:59:29 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.359 19:59:29 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:45.359 19:59:29 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:45.359 19:59:29 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.359 19:59:29 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176544 kB' 'MemFree: 39800112 kB' 'MemUsed: 4376432 kB' 'SwapCached: 0 kB' 'Active: 2357652 kB' 'Inactive: 73816 kB' 'Active(anon): 2206964 kB' 'Inactive(anon): 0 kB' 'Active(file): 150688 kB' 'Inactive(file): 73816 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2152340 kB' 'Mapped: 132288 kB' 'AnonPages: 279332 kB' 'Shmem: 1927836 kB' 'KernelStack: 6696 kB' 'PageTables: 4644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 74576 kB' 'Slab: 230112 kB' 'SReclaimable: 74576 kB' 'SUnreclaim: 155536 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 
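For reference: the xtrace around this point is setup/common.sh's get_meminfo walking every meminfo line until it reaches the requested key, reading either the system-wide /proc/meminfo or, when a node is given, /sys/devices/system/node/node<N>/meminfo (whose lines carry a "Node <n> " prefix that the script strips). A minimal stand-alone sketch of the same lookup, using an awk one-liner in place of the script's read loop; the function name is made up and is not part of the SPDK scripts:

get_meminfo_sketch() {
    local get=$1 node=$2 mem_f=/proc/meminfo
    # with a node argument, read that node's meminfo; its lines start with "Node <n> "
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    awk -v key="${get}:" '{ sub(/^Node [0-9]+ /, "") } $1 == key { print $2 }' "$mem_f"
}
# e.g. get_meminfo_sketch HugePages_Total    -> 1025 in this run (system-wide)
#      get_meminfo_sketch HugePages_Surp 0   -> 0    in this run (NUMA node 0)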
00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.359 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.359 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.360 19:59:29 -- setup/common.sh@32 -- # continue 00:03:45.360 19:59:29 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.360 19:59:29 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.360 19:59:29 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:45.360 19:59:29 -- setup/common.sh@33 -- # echo 0 00:03:45.360 19:59:29 -- setup/common.sh@33 -- # return 0 00:03:45.360 19:59:29 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:45.360 19:59:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:45.360 19:59:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:45.360 19:59:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:45.360 19:59:29 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:45.360 node0=512 expecting 513 00:03:45.360 19:59:29 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:45.360 19:59:29 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:45.360 19:59:29 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:45.360 19:59:29 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:45.360 node1=513 expecting 512 00:03:45.360 19:59:29 -- 
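For reference: the "node0=512 expecting 513" / "node1=513 expecting 512" lines just above are the odd_alloc result. The test asked for an odd total of 1025 pages split 512/513 across the two NUMA nodes, and it appears to accept any placement as long as the set of per-node counts matches the set it planned, which is what the sorted_t/sorted_s comparison the log continues with below ([[ 512 513 == 512 513 ]]) decides. A small illustrative sketch of that check, reusing the array names from the xtrace; the hard-coded values and which array holds the sysfs side versus the planned side are assumptions for the example:

nodes_sys=(  [0]=512 [1]=513 )     # per-node counts read back from sysfs (assumed mapping)
nodes_test=( [0]=513 [1]=512 )     # per-node counts the test planned (assumed mapping)
sorted_t=() sorted_s=()
for node in "${!nodes_test[@]}"; do
    sorted_t[nodes_test[node]]=1   # index by count: listing the indices yields them sorted
    sorted_s[nodes_sys[node]]=1
done
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "per-node hugepage split matches: ${!sorted_t[*]}"
# -> per-node hugepage split matches: 512 513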
setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:45.360 00:03:45.360 real 0m5.648s 00:03:45.360 user 0m1.994s 00:03:45.360 sys 0m3.675s 00:03:45.360 19:59:29 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:45.360 19:59:29 -- common/autotest_common.sh@10 -- # set +x 00:03:45.360 ************************************ 00:03:45.360 END TEST odd_alloc 00:03:45.360 ************************************ 00:03:45.360 19:59:29 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:45.360 19:59:29 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:45.360 19:59:29 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:45.360 19:59:29 -- common/autotest_common.sh@10 -- # set +x 00:03:45.747 ************************************ 00:03:45.747 START TEST custom_alloc 00:03:45.747 ************************************ 00:03:45.747 19:59:29 -- common/autotest_common.sh@1121 -- # custom_alloc 00:03:45.747 19:59:29 -- setup/hugepages.sh@167 -- # local IFS=, 00:03:45.747 19:59:29 -- setup/hugepages.sh@169 -- # local node 00:03:45.747 19:59:29 -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:45.747 19:59:29 -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:45.747 19:59:29 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:45.747 19:59:29 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:45.747 19:59:29 -- setup/hugepages.sh@49 -- # local size=1048576 00:03:45.747 19:59:29 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:45.747 19:59:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:45.747 19:59:29 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:45.747 19:59:29 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:45.747 19:59:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:45.747 19:59:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:45.747 19:59:29 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:45.747 19:59:29 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:45.747 19:59:29 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:45.747 19:59:29 -- setup/hugepages.sh@83 -- # : 256 00:03:45.747 19:59:29 -- setup/hugepages.sh@84 -- # : 1 00:03:45.747 19:59:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:45.747 19:59:29 -- setup/hugepages.sh@83 -- # : 0 00:03:45.747 19:59:29 -- setup/hugepages.sh@84 -- # : 0 00:03:45.747 19:59:29 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:45.747 19:59:29 -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:45.747 19:59:29 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:45.747 19:59:29 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:45.747 19:59:29 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:45.747 19:59:29 -- setup/hugepages.sh@62 -- # 
user_nodes=() 00:03:45.747 19:59:29 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:45.747 19:59:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:45.747 19:59:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:45.747 19:59:29 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:45.747 19:59:29 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:45.747 19:59:29 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:45.747 19:59:29 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:45.747 19:59:29 -- setup/hugepages.sh@78 -- # return 0 00:03:45.747 19:59:29 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:45.747 19:59:29 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:45.747 19:59:29 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:45.747 19:59:29 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:45.747 19:59:29 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:45.747 19:59:29 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:45.747 19:59:29 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:45.747 19:59:29 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:45.747 19:59:29 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:45.747 19:59:29 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:45.747 19:59:29 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:45.747 19:59:29 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:45.747 19:59:29 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:45.747 19:59:29 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:45.747 19:59:29 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:45.747 19:59:29 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:45.747 19:59:29 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:03:45.747 19:59:29 -- setup/hugepages.sh@78 -- # return 0 00:03:45.747 19:59:29 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:45.747 19:59:29 -- setup/hugepages.sh@187 -- # setup output 00:03:45.747 19:59:29 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:45.747 19:59:29 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:49.051 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:49.051 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:80:04.5 (8086 
2021): Already using the vfio-pci driver 00:03:49.051 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:49.051 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:50.960 19:59:35 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:03:50.960 19:59:35 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:50.960 19:59:35 -- setup/hugepages.sh@89 -- # local node 00:03:50.960 19:59:35 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:50.960 19:59:35 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:50.960 19:59:35 -- setup/hugepages.sh@92 -- # local surp 00:03:50.960 19:59:35 -- setup/hugepages.sh@93 -- # local resv 00:03:50.960 19:59:35 -- setup/hugepages.sh@94 -- # local anon 00:03:50.960 19:59:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:50.960 19:59:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:50.960 19:59:35 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:50.960 19:59:35 -- setup/common.sh@18 -- # local node= 00:03:50.960 19:59:35 -- setup/common.sh@19 -- # local var val 00:03:50.960 19:59:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.960 19:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.960 19:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.960 19:59:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.960 19:59:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.960 19:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74692464 kB' 'MemAvailable: 78348364 kB' 'Buffers: 19064 kB' 'Cached: 11479360 kB' 'SwapCached: 0 kB' 'Active: 8536616 kB' 'Inactive: 3531884 kB' 'Active(anon): 7871100 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 573464 kB' 'Mapped: 202088 kB' 'Shmem: 7301024 kB' 'KReclaimable: 224244 kB' 'Slab: 602092 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 377848 kB' 'KernelStack: 16256 kB' 'PageTables: 8700 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 9241032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210804 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ MemFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var 
val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.960 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.960 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 
19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- 
setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.961 19:59:35 -- setup/common.sh@33 -- # echo 0 00:03:50.961 19:59:35 -- setup/common.sh@33 -- # return 0 00:03:50.961 19:59:35 -- setup/hugepages.sh@97 -- # anon=0 00:03:50.961 19:59:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:50.961 19:59:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:50.961 19:59:35 -- setup/common.sh@18 -- # local node= 00:03:50.961 19:59:35 -- setup/common.sh@19 -- # local var val 00:03:50.961 19:59:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.961 19:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.961 19:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.961 19:59:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.961 19:59:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.961 19:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74692680 kB' 'MemAvailable: 78348580 kB' 'Buffers: 19064 kB' 'Cached: 11479364 kB' 'SwapCached: 0 kB' 'Active: 8536528 kB' 'Inactive: 3531884 kB' 'Active(anon): 7871012 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 573380 kB' 'Mapped: 202068 kB' 'Shmem: 7301028 kB' 'KReclaimable: 224244 kB' 'Slab: 602084 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 377840 kB' 'KernelStack: 16240 kB' 'PageTables: 8664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 9241044 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210772 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 
19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.961 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.961 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 
19:59:35 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 
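The repeated match/continue entries above are setup/common.sh's get_meminfo walking /proc/meminfo key by key until it reaches the requested field. A minimal stand-alone sketch of the same lookup follows; the helper name lookup_meminfo and its argument handling are assumed for illustration and are not taken verbatim from the SPDK sources.

# Sketch only: approximates the get_meminfo behaviour traced above.
# lookup_meminfo and its arguments are assumed names, not SPDK's code.
lookup_meminfo() {  # usage: lookup_meminfo <Key> [node]
    local key=$1 node=${2:-} file=/proc/meminfo line var val _
    # With a node argument, read the per-node sysfs copy instead of /proc/meminfo.
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        file=/sys/devices/system/node/node$node/meminfo
    while read -r line; do
        line=${line#Node "$node" }        # per-node lines carry a "Node <id> " prefix
        IFS=': ' read -r var val _ <<< "$line"
        # Same idea as the [[ $var == <Key> ]] / continue loop in the trace.
        if [[ $var == "$key" ]]; then
            echo "${val:-0}"
            return 0
        fi
    done < "$file"
    echo 0
}

For the meminfo snapshot printed above, lookup_meminfo HugePages_Surp would return 0 and lookup_meminfo HugePages_Total would return 1536.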
00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.962 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.962 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.962 19:59:35 -- setup/common.sh@33 -- # echo 0 00:03:50.962 19:59:35 -- setup/common.sh@33 -- # return 0 00:03:50.962 19:59:35 -- setup/hugepages.sh@99 -- # surp=0 00:03:50.962 19:59:35 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:50.963 19:59:35 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:50.963 19:59:35 -- setup/common.sh@18 -- # local node= 00:03:50.963 19:59:35 -- setup/common.sh@19 -- # local var val 00:03:50.963 19:59:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.963 19:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.963 19:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.963 19:59:35 -- 
setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.963 19:59:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.963 19:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74692176 kB' 'MemAvailable: 78348076 kB' 'Buffers: 19064 kB' 'Cached: 11479364 kB' 'SwapCached: 0 kB' 'Active: 8536528 kB' 'Inactive: 3531884 kB' 'Active(anon): 7871012 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 573380 kB' 'Mapped: 202068 kB' 'Shmem: 7301028 kB' 'KReclaimable: 224244 kB' 'Slab: 602084 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 377840 kB' 'KernelStack: 16240 kB' 'PageTables: 8664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 9241060 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210772 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # 
continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ 
WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.963 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.963 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:50.964 19:59:35 -- setup/common.sh@33 -- # echo 0 00:03:50.964 19:59:35 -- setup/common.sh@33 -- # return 0 00:03:50.964 19:59:35 -- setup/hugepages.sh@100 -- # resv=0 00:03:50.964 19:59:35 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:50.964 nr_hugepages=1536 00:03:50.964 19:59:35 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:50.964 resv_hugepages=0 00:03:50.964 19:59:35 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:50.964 surplus_hugepages=0 00:03:50.964 19:59:35 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:50.964 anon_hugepages=0 00:03:50.964 19:59:35 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:50.964 19:59:35 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:50.964 19:59:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:50.964 19:59:35 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:50.964 19:59:35 -- setup/common.sh@18 -- # local node= 00:03:50.964 19:59:35 -- setup/common.sh@19 -- # local var val 00:03:50.964 19:59:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.964 19:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.964 19:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.964 19:59:35 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.964 19:59:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.964 19:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.964 19:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 74693960 kB' 'MemAvailable: 78349860 kB' 'Buffers: 19064 kB' 'Cached: 11479392 kB' 'SwapCached: 0 kB' 'Active: 8536552 kB' 'Inactive: 3531884 kB' 'Active(anon): 7871036 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 
kB' 'Writeback: 0 kB' 'AnonPages: 573380 kB' 'Mapped: 202068 kB' 'Shmem: 7301056 kB' 'KReclaimable: 224244 kB' 'Slab: 602084 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 377840 kB' 'KernelStack: 16240 kB' 'PageTables: 8664 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 9241072 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210772 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.964 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.964 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.964 19:59:35 -- 
setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ 
Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.965 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.965 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:50.966 19:59:35 -- setup/common.sh@33 -- # echo 1536 00:03:50.966 19:59:35 -- setup/common.sh@33 -- # return 0 00:03:50.966 19:59:35 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:50.966 19:59:35 -- setup/hugepages.sh@112 -- # get_nodes 00:03:50.966 19:59:35 -- setup/hugepages.sh@27 -- # local node 00:03:50.966 19:59:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:50.966 19:59:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:50.966 19:59:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:50.966 19:59:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:50.966 19:59:35 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:50.966 19:59:35 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:50.966 19:59:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:50.966 19:59:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:50.966 19:59:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:50.966 19:59:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:50.966 19:59:35 -- setup/common.sh@18 -- # local node=0 00:03:50.966 19:59:35 -- setup/common.sh@19 -- # local var val 00:03:50.966 19:59:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.966 19:59:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.966 19:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:50.966 19:59:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:50.966 19:59:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.966 19:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 35904660 kB' 'MemUsed: 12212304 kB' 'SwapCached: 0 kB' 'Active: 6178720 kB' 'Inactive: 3458068 kB' 'Active(anon): 5663892 kB' 'Inactive(anon): 0 kB' 'Active(file): 514828 kB' 'Inactive(file): 3458068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9346016 kB' 'Mapped: 69700 kB' 'AnonPages: 293916 kB' 'Shmem: 5373120 kB' 'KernelStack: 9560 kB' 'PageTables: 4068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149668 kB' 'Slab: 371972 kB' 'SReclaimable: 149668 kB' 'SUnreclaim: 222304 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- 
setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 
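By this point the trace has extracted nr_hugepages=1536, resv_hugepages=0, surplus_hugepages=0 and anon_hugepages=0, and is re-checking the pool per NUMA node (512 pages expected on node0, 1024 on node1). The consistency test behind hugepages.sh@107/@110 reduces to a small accounting identity; a hedged stand-alone rendering, with assumed variable names and reusing the lookup_meminfo sketch above, is:

# Sketch of the accounting check the trace performs; names are illustrative.
requested=1536                              # 512 on node0 + 1024 on node1 in this run
total=$(lookup_meminfo HugePages_Total)
surp=$(lookup_meminfo HugePages_Surp)
resv=$(lookup_meminfo HugePages_Rsvd)
if (( requested == total + surp + resv )); then
    echo "hugepage pool consistent: total=$total surp=$surp resv=$resv"
else
    echo "hugepage pool mismatch: wanted $requested, kernel shows $((total + surp + resv))" >&2
fi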
00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- 
setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.966 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.966 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@33 -- # echo 0 00:03:50.967 19:59:35 -- setup/common.sh@33 -- # return 0 00:03:50.967 19:59:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:50.967 19:59:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:50.967 19:59:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:50.967 19:59:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:50.967 19:59:35 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:50.967 19:59:35 -- setup/common.sh@18 -- # local node=1 00:03:50.967 19:59:35 -- setup/common.sh@19 -- # local var val 00:03:50.967 19:59:35 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.967 19:59:35 -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.967 19:59:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:50.967 19:59:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:50.967 19:59:35 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.967 19:59:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44176544 kB' 'MemFree: 38789048 kB' 'MemUsed: 5387496 kB' 'SwapCached: 0 kB' 'Active: 2357508 kB' 'Inactive: 73816 kB' 'Active(anon): 2206820 kB' 'Inactive(anon): 0 kB' 'Active(file): 150688 kB' 'Inactive(file): 73816 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 2152472 kB' 'Mapped: 132368 kB' 'AnonPages: 279076 kB' 'Shmem: 1927968 kB' 'KernelStack: 6664 kB' 'PageTables: 4540 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 74576 kB' 'Slab: 230112 kB' 'SReclaimable: 74576 kB' 'SUnreclaim: 155536 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 
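
[editor's note] The trace above is setup/common.sh running get_meminfo HugePages_Surp 1: it prefers the per-node file /sys/devices/system/node/node1/meminfo over /proc/meminfo, strips the "Node <id> " prefix so the keys match the global layout, then scans key/value pairs until the requested field is found. The node-1 dump it prints is self-consistent: MemUsed = MemTotal - MemFree = 44176544 kB - 38789048 kB = 5387496 kB (MemUsed only appears in the per-node files). A minimal stand-alone sketch of that helper, reconstructed from the visible trace rather than copied from the repository:

  #!/usr/bin/env bash
  # get_meminfo <field> [node] -- sketch of the helper traced above; the real
  # setup/common.sh may differ in details, this only mirrors the visible steps.
  shopt -s extglob
  get_meminfo() {
      local get=$1 node=${2:-} var val _ mem_f=/proc/meminfo
      local -a mem
      # Prefer the per-node view when a node id was given and sysfs exposes it.
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      mapfile -t mem < "$mem_f"
      # Per-node lines are prefixed with "Node <id> "; strip it so the keys
      # line up with the global /proc/meminfo format.
      mem=("${mem[@]#Node +([0-9]) }")
      local line
      for line in "${mem[@]}"; do
          IFS=': ' read -r var val _ <<< "$line"
          [[ $var == "$get" ]] || continue
          echo "$val"
          return 0
      done
      return 1
  }
  get_meminfo HugePages_Surp 1   # prints the node-1 surplus count (0 in the run above)
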
00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- 
setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.967 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.967 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 
00:03:50.968 19:59:35 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # continue 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.968 19:59:35 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.968 19:59:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.968 19:59:35 -- setup/common.sh@33 -- # echo 0 00:03:50.968 19:59:35 -- setup/common.sh@33 -- # return 0 00:03:50.968 19:59:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:50.968 19:59:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:50.968 19:59:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:50.968 19:59:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:50.968 19:59:35 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:50.968 node0=512 expecting 512 00:03:50.968 19:59:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:50.968 19:59:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:50.968 19:59:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:50.968 19:59:35 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:50.968 node1=1024 expecting 1024 00:03:50.968 19:59:35 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:50.968 00:03:50.968 real 0m5.412s 00:03:50.968 user 0m1.724s 00:03:50.968 sys 0m3.598s 00:03:50.968 19:59:35 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:03:50.968 19:59:35 -- common/autotest_common.sh@10 -- # set +x 00:03:50.968 ************************************ 00:03:50.968 END TEST custom_alloc 00:03:50.968 ************************************ 00:03:51.227 19:59:35 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:51.227 19:59:35 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:03:51.228 19:59:35 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:03:51.228 19:59:35 -- common/autotest_common.sh@10 -- # set +x 00:03:51.228 ************************************ 00:03:51.228 START TEST no_shrink_alloc 00:03:51.228 ************************************ 00:03:51.228 19:59:35 -- common/autotest_common.sh@1121 -- # no_shrink_alloc 00:03:51.228 19:59:35 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:51.228 19:59:35 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:51.228 19:59:35 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:51.228 19:59:35 -- setup/hugepages.sh@51 -- # shift 00:03:51.228 19:59:35 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:51.228 19:59:35 -- setup/hugepages.sh@52 -- # local node_ids 00:03:51.228 19:59:35 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:51.228 19:59:35 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:51.228 19:59:35 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:51.228 19:59:35 -- 
setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:51.228 19:59:35 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:51.228 19:59:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:51.228 19:59:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:51.228 19:59:35 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:51.228 19:59:35 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:51.228 19:59:35 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:51.228 19:59:35 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:51.228 19:59:35 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:51.228 19:59:35 -- setup/hugepages.sh@73 -- # return 0 00:03:51.228 19:59:35 -- setup/hugepages.sh@198 -- # setup output 00:03:51.228 19:59:35 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:51.228 19:59:35 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:54.518 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:54.518 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:54.518 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:56.423 19:59:40 -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:56.423 19:59:40 -- setup/hugepages.sh@89 -- # local node 00:03:56.423 19:59:40 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:56.423 19:59:40 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:56.423 19:59:40 -- setup/hugepages.sh@92 -- # local surp 00:03:56.423 19:59:40 -- setup/hugepages.sh@93 -- # local resv 00:03:56.423 19:59:40 -- setup/hugepages.sh@94 -- # local anon 00:03:56.423 19:59:40 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:56.423 19:59:40 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:56.423 19:59:40 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:56.423 19:59:40 -- setup/common.sh@18 -- # local node= 00:03:56.423 19:59:40 -- setup/common.sh@19 -- # local var val 00:03:56.423 19:59:40 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.423 19:59:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.423 19:59:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.423 19:59:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.423 19:59:40 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.423 19:59:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.423 19:59:40 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.423 19:59:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75719104 kB' 'MemAvailable: 79375004 kB' 'Buffers: 19064 kB' 'Cached: 11479512 kB' 'SwapCached: 0 kB' 'Active: 8538556 kB' 'Inactive: 3531884 kB' 'Active(anon): 7873040 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575036 kB' 'Mapped: 202236 kB' 'Shmem: 7301176 kB' 'KReclaimable: 224244 kB' 'Slab: 602776 kB' 'SReclaimable: 224244 kB' 'SUnreclaim: 378532 kB' 'KernelStack: 16224 kB' 'PageTables: 8612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9241492 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210884 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.423 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.423 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 
19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.424 19:59:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.424 19:59:40 -- setup/common.sh@33 -- # echo 0 00:03:56.424 19:59:40 -- setup/common.sh@33 -- # return 0 00:03:56.424 19:59:40 -- setup/hugepages.sh@97 -- # anon=0 00:03:56.424 19:59:40 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:56.424 19:59:40 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.424 19:59:40 -- setup/common.sh@18 -- # local node= 00:03:56.424 19:59:40 -- setup/common.sh@19 -- # local var val 00:03:56.424 19:59:40 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.424 19:59:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.424 19:59:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.424 19:59:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.424 19:59:40 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.424 19:59:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.424 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75721016 kB' 'MemAvailable: 79376900 kB' 'Buffers: 19064 kB' 'Cached: 11479516 kB' 'SwapCached: 0 kB' 'Active: 8539080 kB' 'Inactive: 3531884 kB' 'Active(anon): 7873564 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575508 kB' 'Mapped: 202236 kB' 'Shmem: 7301180 kB' 'KReclaimable: 
224212 kB' 'Slab: 602736 kB' 'SReclaimable: 224212 kB' 'SUnreclaim: 378524 kB' 'KernelStack: 16256 kB' 'PageTables: 8728 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9241404 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210804 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val 
_ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- 
# IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
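
[editor's note] The AnonHugePages pass traced earlier (hugepages.sh@96-@97, which recorded anon=0) appears to be gated on the transparent-hugepage policy string, "always [madvise] never" on this host. Roughly, and assuming the standard sysfs location rather than whatever the script actually reads:

  thp=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
  anon=0
  if [[ $thp != *"[never]"* ]]; then
      # THP is available, so anonymous huge pages could inflate the counters;
      # record them (the get_meminfo sketch above returns the value in kB).
      anon=$(get_meminfo AnonHugePages)
  fi
  echo "anon=${anon}"   # 0 kB in this run
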
00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.425 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.425 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # 
read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.426 19:59:40 -- setup/common.sh@33 -- # echo 0 00:03:56.426 19:59:40 -- setup/common.sh@33 -- # return 0 00:03:56.426 19:59:40 -- setup/hugepages.sh@99 -- # surp=0 00:03:56.426 19:59:40 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:56.426 19:59:40 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:56.426 19:59:40 -- setup/common.sh@18 -- # local node= 00:03:56.426 19:59:40 -- setup/common.sh@19 -- # local var val 00:03:56.426 19:59:40 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.426 19:59:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.426 19:59:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.426 19:59:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.426 19:59:40 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.426 19:59:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75721524 kB' 'MemAvailable: 79377408 kB' 'Buffers: 19064 kB' 'Cached: 11479528 kB' 'SwapCached: 0 kB' 'Active: 8538644 kB' 'Inactive: 3531884 kB' 'Active(anon): 7873128 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 574984 kB' 'Mapped: 202228 kB' 'Shmem: 7301192 kB' 'KReclaimable: 224212 kB' 'Slab: 602676 kB' 'SReclaimable: 224212 kB' 'SUnreclaim: 378464 kB' 'KernelStack: 16240 kB' 'PageTables: 8688 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9242448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210804 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 
-- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 
-- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.426 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.426 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ SUnreclaim == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- 
setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.427 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.427 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.427 19:59:40 -- setup/common.sh@33 -- # echo 0 00:03:56.427 19:59:40 -- setup/common.sh@33 -- # return 0 00:03:56.427 19:59:40 -- setup/hugepages.sh@100 -- # resv=0 00:03:56.427 19:59:40 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:56.427 nr_hugepages=1024 
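The loop traced above is a plain key lookup over meminfo: the script slurps the file, strips any "Node N " prefix, splits each line on ': ', and echoes the value once the requested key (here HugePages_Rsvd) is reached. A minimal bash sketch of that lookup, reconstructed from the commands visible in the trace; the function name and argument handling are illustrative, not necessarily identical to the real setup/common.sh:

#!/usr/bin/env bash
shopt -s extglob                      # needed for the +([0-9]) prefix strip below
get_meminfo_sketch() {
  local get=$1 node=${2:-}            # key to look up, optional NUMA node
  local mem_f=/proc/meminfo var val _
  local -a mem
  # prefer the per-node meminfo when a node is given and its file exists
  [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
    mem_f=/sys/devices/system/node/node$node/meminfo
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")    # drop the "Node N " prefix of per-node files
  local line
  for line in "${mem[@]}"; do
    IFS=': ' read -r var val _ <<< "$line"
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done
  return 1
}
# e.g. get_meminfo_sketch HugePages_Rsvd  -> "0" on this box, hence resv=0 above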
00:03:56.427 19:59:40 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:56.427 resv_hugepages=0 00:03:56.427 19:59:40 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:56.427 surplus_hugepages=0 00:03:56.427 19:59:40 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:56.427 anon_hugepages=0 00:03:56.427 19:59:40 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:56.427 19:59:40 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:56.427 19:59:40 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:56.427 19:59:40 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:56.427 19:59:40 -- setup/common.sh@18 -- # local node= 00:03:56.427 19:59:40 -- setup/common.sh@19 -- # local var val 00:03:56.427 19:59:40 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.427 19:59:40 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.427 19:59:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.427 19:59:40 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.427 19:59:40 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.427 19:59:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.427 19:59:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75721516 kB' 'MemAvailable: 79377368 kB' 'Buffers: 19064 kB' 'Cached: 11479528 kB' 'SwapCached: 0 kB' 'Active: 8538420 kB' 'Inactive: 3531884 kB' 'Active(anon): 7872904 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 575208 kB' 'Mapped: 202244 kB' 'Shmem: 7301192 kB' 'KReclaimable: 224148 kB' 'Slab: 602512 kB' 'SReclaimable: 224148 kB' 'SUnreclaim: 378364 kB' 'KernelStack: 16208 kB' 'PageTables: 8600 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9242460 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210772 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:03:56.428 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.428 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.428 19:59:40 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.428 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.428 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.428 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.428 19:59:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.428 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.428 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- 
setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.688 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.688 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- 
setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # 
IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.689 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.689 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.689 19:59:40 -- setup/common.sh@33 -- # echo 1024 00:03:56.689 19:59:40 -- setup/common.sh@33 -- # return 0 00:03:56.689 19:59:40 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:56.689 19:59:40 -- setup/hugepages.sh@112 -- # get_nodes 00:03:56.689 19:59:40 -- setup/hugepages.sh@27 -- # local node 00:03:56.689 19:59:40 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.689 19:59:40 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:56.689 19:59:40 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.689 19:59:40 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:56.689 19:59:40 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:56.689 19:59:40 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:56.689 19:59:40 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:56.689 19:59:40 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:56.689 19:59:40 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:56.689 19:59:40 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.689 19:59:40 -- setup/common.sh@18 -- # local node=0 00:03:56.689 19:59:40 -- setup/common.sh@19 -- # local var val 00:03:56.690 19:59:40 -- setup/common.sh@20 -- # local mem_f mem 00:03:56.690 19:59:40 -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.690 19:59:40 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:56.690 19:59:40 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:56.690 19:59:40 -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.690 19:59:40 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 34855560 kB' 'MemUsed: 13261404 kB' 'SwapCached: 0 kB' 'Active: 6181380 kB' 'Inactive: 3458068 kB' 'Active(anon): 5666552 kB' 'Inactive(anon): 0 kB' 'Active(file): 514828 kB' 'Inactive(file): 3458068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9346052 kB' 'Mapped: 69796 kB' 'AnonPages: 296632 kB' 'Shmem: 5373156 kB' 'KernelStack: 9720 kB' 'PageTables: 4880 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149636 kB' 'Slab: 372280 kB' 'SReclaimable: 149636 kB' 'SUnreclaim: 222644 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 
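At this point the same lookup is repeated against node0's own meminfo (/sys/devices/system/node/node0/meminfo, whose fields appear in the printf above) to confirm the pages actually landed on that node. A simplified, standalone version of that per-node check, using awk in place of the traced bash loop; the expected spread of 1024 pages on node0 and 0 on node1 is the one visible in this run, and the real script additionally folds reserved and surplus pages into the comparison:

declare -A nodes_test=([0]=1024 [1]=0)   # expected hugepage spread observed in this run
for node in "${!nodes_test[@]}"; do
  f=/sys/devices/system/node/node$node/meminfo
  # per-node lines look like "Node 0 HugePages_Total:  1024", so take the last field
  actual=$(awk '/HugePages_Total:/ {print $NF}' "$f")
  echo "node$node=$actual expecting ${nodes_test[$node]}"
  [[ $actual == "${nodes_test[$node]}" ]] || exit 1
done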
00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- 
setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 
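For the global pool the check boils down to an arithmetic identity: the HugePages_Total read back from /proc/meminfo must equal the configured nr_hugepages plus reserved plus surplus pages, which is the `(( 1024 == nr_hugepages + surp + resv ))` expansion seen earlier. A compact form of that check, with the values taken from this run and awk standing in for the traced bash loop:

nr_hugepages=1024 resv=0 surp=0        # values echoed earlier in the trace
total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
if (( total == nr_hugepages + resv + surp )); then
  echo "nr_hugepages=$nr_hugepages resv_hugepages=$resv surplus_hugepages=$surp"
else
  echo "hugepage accounting mismatch: total=$total" >&2
  exit 1
fi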
00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # continue 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # IFS=': ' 00:03:56.690 19:59:40 -- setup/common.sh@31 -- # read -r var val _ 00:03:56.690 19:59:40 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.691 19:59:40 -- setup/common.sh@33 -- # echo 0 00:03:56.691 19:59:40 -- setup/common.sh@33 -- # return 0 00:03:56.691 19:59:40 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:56.691 19:59:40 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:56.691 19:59:40 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:56.691 19:59:40 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:56.691 19:59:40 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:56.691 node0=1024 expecting 1024 00:03:56.691 19:59:40 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:56.691 19:59:40 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:56.691 19:59:40 -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:56.691 19:59:40 -- setup/hugepages.sh@202 -- # setup output 00:03:56.691 19:59:40 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:56.691 19:59:40 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:00.885 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:1a:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:00.885 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:00.885 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:02.792 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:04:02.792 19:59:46 -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:04:02.792 19:59:46 -- setup/hugepages.sh@89 -- # local node 00:04:02.792 19:59:46 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:02.792 19:59:46 -- setup/hugepages.sh@91 
-- # local sorted_s 00:04:02.792 19:59:46 -- setup/hugepages.sh@92 -- # local surp 00:04:02.792 19:59:46 -- setup/hugepages.sh@93 -- # local resv 00:04:02.792 19:59:46 -- setup/hugepages.sh@94 -- # local anon 00:04:02.792 19:59:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:02.792 19:59:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:02.792 19:59:46 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:02.792 19:59:46 -- setup/common.sh@18 -- # local node= 00:04:02.792 19:59:46 -- setup/common.sh@19 -- # local var val 00:04:02.792 19:59:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:02.792 19:59:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.792 19:59:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.792 19:59:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.792 19:59:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.792 19:59:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75702744 kB' 'MemAvailable: 79358596 kB' 'Buffers: 19064 kB' 'Cached: 11479644 kB' 'SwapCached: 0 kB' 'Active: 8537956 kB' 'Inactive: 3531884 kB' 'Active(anon): 7872440 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 574484 kB' 'Mapped: 202228 kB' 'Shmem: 7301308 kB' 'KReclaimable: 224148 kB' 'Slab: 602960 kB' 'SReclaimable: 224148 kB' 'SUnreclaim: 378812 kB' 'KernelStack: 16416 kB' 'PageTables: 9040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9241876 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210804 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 
19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.792 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.792 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 
19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:02.793 19:59:46 -- setup/common.sh@33 -- # echo 0 00:04:02.793 19:59:46 -- setup/common.sh@33 -- # return 0 00:04:02.793 19:59:46 -- setup/hugepages.sh@97 -- # anon=0 00:04:02.793 19:59:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:02.793 19:59:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:02.793 19:59:46 -- setup/common.sh@18 -- # local node= 00:04:02.793 19:59:46 -- setup/common.sh@19 -- # local var val 00:04:02.793 19:59:46 -- 
setup/common.sh@20 -- # local mem_f mem 00:04:02.793 19:59:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.793 19:59:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.793 19:59:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.793 19:59:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.793 19:59:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75703396 kB' 'MemAvailable: 79359248 kB' 'Buffers: 19064 kB' 'Cached: 11479648 kB' 'SwapCached: 0 kB' 'Active: 8537948 kB' 'Inactive: 3531884 kB' 'Active(anon): 7872432 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 574544 kB' 'Mapped: 202212 kB' 'Shmem: 7301312 kB' 'KReclaimable: 224148 kB' 'Slab: 602944 kB' 'SReclaimable: 224148 kB' 'SUnreclaim: 378796 kB' 'KernelStack: 16384 kB' 'PageTables: 9172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9241520 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210756 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 
00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.793 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.793 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Bounce == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 
00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.794 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.794 19:59:46 -- setup/common.sh@33 -- # echo 0 00:04:02.794 19:59:46 -- setup/common.sh@33 -- # return 0 00:04:02.794 19:59:46 -- setup/hugepages.sh@99 -- # surp=0 00:04:02.794 19:59:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:02.794 19:59:46 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:02.794 19:59:46 -- setup/common.sh@18 -- # local node= 00:04:02.794 19:59:46 -- setup/common.sh@19 -- # local var val 00:04:02.794 19:59:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:02.794 19:59:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.794 19:59:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.794 19:59:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.794 19:59:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.794 19:59:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.794 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75703884 kB' 'MemAvailable: 79359736 kB' 'Buffers: 19064 kB' 'Cached: 11479660 kB' 'SwapCached: 0 kB' 'Active: 8537568 kB' 'Inactive: 3531884 kB' 'Active(anon): 7872052 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 
'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 574152 kB' 'Mapped: 202212 kB' 'Shmem: 7301324 kB' 'KReclaimable: 224148 kB' 'Slab: 602944 kB' 'SReclaimable: 224148 kB' 'SUnreclaim: 378796 kB' 'KernelStack: 16352 kB' 'PageTables: 8972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9241668 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210756 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # 
read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 
00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.795 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.795 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:02.796 19:59:46 -- setup/common.sh@33 -- # echo 0 00:04:02.796 19:59:46 -- setup/common.sh@33 -- # return 0 00:04:02.796 19:59:46 -- setup/hugepages.sh@100 -- # resv=0 00:04:02.796 19:59:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:02.796 nr_hugepages=1024 00:04:02.796 19:59:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:02.796 resv_hugepages=0 00:04:02.796 19:59:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:02.796 surplus_hugepages=0 00:04:02.796 19:59:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:02.796 anon_hugepages=0 00:04:02.796 19:59:46 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:02.796 19:59:46 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:02.796 19:59:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:02.796 19:59:46 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:02.796 19:59:46 -- setup/common.sh@18 -- # local node= 00:04:02.796 19:59:46 -- setup/common.sh@19 -- # local var val 00:04:02.796 19:59:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:02.796 19:59:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.796 19:59:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:02.796 19:59:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:02.796 19:59:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.796 19:59:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.796 19:59:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293508 kB' 'MemFree: 75703888 kB' 'MemAvailable: 79359740 kB' 'Buffers: 19064 kB' 'Cached: 11479672 kB' 'SwapCached: 0 kB' 'Active: 8537644 kB' 'Inactive: 3531884 kB' 'Active(anon): 7872128 kB' 'Inactive(anon): 0 kB' 'Active(file): 665516 kB' 'Inactive(file): 3531884 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 574148 kB' 'Mapped: 202212 kB' 'Shmem: 7301336 kB' 'KReclaimable: 224148 kB' 'Slab: 602944 kB' 'SReclaimable: 224148 kB' 'SUnreclaim: 378796 kB' 'KernelStack: 16336 kB' 'PageTables: 8916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 9241680 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 210724 kB' 'VmallocChunk: 0 kB' 'Percpu: 53440 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 529852 kB' 'DirectMap2M: 11728896 kB' 'DirectMap1G: 89128960 kB' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.796 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.796 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 
00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # 
continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.797 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.797 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:02.797 19:59:46 -- setup/common.sh@33 -- # echo 1024 00:04:02.797 19:59:46 -- setup/common.sh@33 -- # return 0 00:04:02.797 19:59:46 -- setup/hugepages.sh@110 -- # (( 1024 == 
nr_hugepages + surp + resv )) 00:04:02.797 19:59:46 -- setup/hugepages.sh@112 -- # get_nodes 00:04:02.797 19:59:46 -- setup/hugepages.sh@27 -- # local node 00:04:02.797 19:59:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:02.797 19:59:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:02.797 19:59:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:02.798 19:59:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:02.798 19:59:46 -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:02.798 19:59:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:02.798 19:59:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:02.798 19:59:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:02.798 19:59:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:02.798 19:59:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:02.798 19:59:46 -- setup/common.sh@18 -- # local node=0 00:04:02.798 19:59:46 -- setup/common.sh@19 -- # local var val 00:04:02.798 19:59:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:02.798 19:59:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:02.798 19:59:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:02.798 19:59:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:02.798 19:59:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:02.798 19:59:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48116964 kB' 'MemFree: 34852772 kB' 'MemUsed: 13264192 kB' 'SwapCached: 0 kB' 'Active: 6178996 kB' 'Inactive: 3458068 kB' 'Active(anon): 5664168 kB' 'Inactive(anon): 0 kB' 'Active(file): 514828 kB' 'Inactive(file): 3458068 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 9346084 kB' 'Mapped: 69704 kB' 'AnonPages: 294148 kB' 'Shmem: 5373188 kB' 'KernelStack: 9720 kB' 'PageTables: 4480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 149636 kB' 'Slab: 372608 kB' 'SReclaimable: 149636 kB' 'SUnreclaim: 222972 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 
-- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # 
[[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 
19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.798 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.798 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.799 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.799 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.799 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.799 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.799 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.799 19:59:46 -- setup/common.sh@32 -- # continue 00:04:02.799 19:59:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:02.799 19:59:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:02.799 19:59:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:02.799 19:59:46 -- setup/common.sh@33 -- # echo 0 00:04:02.799 19:59:46 -- setup/common.sh@33 -- # return 0 00:04:02.799 19:59:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:02.799 19:59:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:02.799 19:59:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:02.799 19:59:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:02.799 19:59:46 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:02.799 node0=1024 expecting 1024 00:04:02.799 19:59:46 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:02.799 00:04:02.799 real 0m11.420s 00:04:02.799 user 0m4.062s 00:04:02.799 sys 0m7.315s 00:04:02.799 19:59:46 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:02.799 19:59:46 -- common/autotest_common.sh@10 -- # set +x 00:04:02.799 ************************************ 00:04:02.799 END TEST no_shrink_alloc 00:04:02.799 ************************************ 00:04:02.799 19:59:47 -- setup/hugepages.sh@217 -- # clear_hp 00:04:02.799 19:59:47 -- setup/hugepages.sh@37 -- # local node hp 00:04:02.799 19:59:47 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:02.799 19:59:47 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.799 19:59:47 -- setup/hugepages.sh@41 -- # echo 0 
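The block traced above re-derives the hugepage bookkeeping: nr_hugepages=1024, resv_hugepages=0 and surplus_hugepages=0 are echoed, HugePages_Total is checked against nr_hugepages + surp + resv, and node 0 is expected to hold all 1024 pages ("node0=1024 expecting 1024"). A rough sketch of the same consistency check, reusing the illustrative get_field helper from earlier (the paths are the standard procfs/sysfs locations; the variable names are assumptions, not the test's own):

  expected=1024
  total=$(get_field HugePages_Total)
  rsvd=$(get_field HugePages_Rsvd)
  surp=$(get_field HugePages_Surp)
  (( total == expected + rsvd + surp )) || echo "hugepage accounting mismatch: $total" >&2
  # per-NUMA-node view, analogous to the "node0=1024 expecting 1024" line in the log;
  # node meminfo lines look like "Node 0 HugePages_Total:  1024"
  node0=$(awk '$3=="HugePages_Total:"{print $4}' /sys/devices/system/node/node0/meminfo)
  echo "node0=$node0 expecting $expected"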
00:04:02.799 19:59:47 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.799 19:59:47 -- setup/hugepages.sh@41 -- # echo 0 00:04:02.799 19:59:47 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:02.799 19:59:47 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.799 19:59:47 -- setup/hugepages.sh@41 -- # echo 0 00:04:02.799 19:59:47 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:02.799 19:59:47 -- setup/hugepages.sh@41 -- # echo 0 00:04:02.799 19:59:47 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:02.799 19:59:47 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:02.799 00:04:02.799 real 0m43.718s 00:04:02.799 user 0m13.788s 00:04:02.799 sys 0m26.477s 00:04:02.799 19:59:47 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:02.799 19:59:47 -- common/autotest_common.sh@10 -- # set +x 00:04:02.799 ************************************ 00:04:02.799 END TEST hugepages 00:04:02.799 ************************************ 00:04:02.799 19:59:47 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:02.799 19:59:47 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:02.799 19:59:47 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:02.799 19:59:47 -- common/autotest_common.sh@10 -- # set +x 00:04:03.057 ************************************ 00:04:03.057 START TEST driver 00:04:03.057 ************************************ 00:04:03.057 19:59:47 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh 00:04:03.057 * Looking for test storage... 
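clear_hp, traced immediately above, walks every NUMA node's hugepage directories, writes 0 into each pool, and exports CLEAR_HUGE=yes before the hugepages suite finishes. A hedged sketch of that cleanup (the redirection target is implied by the sysfs layout rather than visible in the xtrace, and root privileges are assumed):

  for node in /sys/devices/system/node/node[0-9]*; do
    for hp in "$node"/hugepages/hugepages-*; do
      echo 0 > "$hp/nr_hugepages"          # release this node's pool for this page size
    done
  done
  export CLEAR_HUGE=yes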
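The driver suite that starts here leads into guess_driver, whose trace further below prefers vfio-pci once it sees populated IOMMU groups and resolves the module with modprobe --show-depends. A simplified sketch of that decision, not the actual setup/driver.sh logic (the uio_pci_generic fallback is an assumption added for illustration; this excerpt does not show the fallback path):

  shopt -s nullglob
  groups=(/sys/kernel/iommu_groups/*)      # the trace below counts 238 groups on this host
  if (( ${#groups[@]} > 0 )) && modprobe --show-depends vfio_pci | grep -q '\.ko'; then
    driver=vfio-pci
  else
    driver=uio_pci_generic                 # assumed fallback, not shown in this excerpt
  fi
  echo "Looking for driver=$driver"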
00:04:03.057 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:03.057 19:59:47 -- setup/driver.sh@68 -- # setup reset 00:04:03.057 19:59:47 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:03.057 19:59:47 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:11.180 19:59:54 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:11.180 19:59:54 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:11.180 19:59:54 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:11.180 19:59:54 -- common/autotest_common.sh@10 -- # set +x 00:04:11.180 ************************************ 00:04:11.180 START TEST guess_driver 00:04:11.180 ************************************ 00:04:11.180 19:59:54 -- common/autotest_common.sh@1121 -- # guess_driver 00:04:11.180 19:59:54 -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:11.180 19:59:54 -- setup/driver.sh@47 -- # local fail=0 00:04:11.180 19:59:54 -- setup/driver.sh@49 -- # pick_driver 00:04:11.180 19:59:54 -- setup/driver.sh@36 -- # vfio 00:04:11.180 19:59:54 -- setup/driver.sh@21 -- # local iommu_grups 00:04:11.180 19:59:54 -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:11.180 19:59:54 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:11.180 19:59:54 -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:11.180 19:59:54 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:11.180 19:59:54 -- setup/driver.sh@29 -- # (( 238 > 0 )) 00:04:11.180 19:59:54 -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:11.180 19:59:54 -- setup/driver.sh@14 -- # mod vfio_pci 00:04:11.180 19:59:54 -- setup/driver.sh@12 -- # dep vfio_pci 00:04:11.180 19:59:54 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:11.180 19:59:54 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:11.180 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:11.180 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:11.180 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:11.180 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:11.180 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:11.180 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:11.180 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:11.180 19:59:54 -- setup/driver.sh@30 -- # return 0 00:04:11.180 19:59:54 -- setup/driver.sh@37 -- # echo vfio-pci 00:04:11.180 19:59:54 -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:11.180 19:59:54 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:11.180 19:59:54 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:11.180 Looking for driver=vfio-pci 00:04:11.180 19:59:54 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:11.180 19:59:54 -- setup/driver.sh@45 -- # setup output config 00:04:11.180 19:59:54 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:11.180 19:59:54 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ 
vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.085 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.085 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.085 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.344 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.344 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.344 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:13.344 19:59:57 -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:13.344 19:59:57 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:13.344 19:59:57 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:16.634 20:00:00 -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:16.634 20:00:00 -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:16.634 20:00:00 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:18.536 20:00:02 -- setup/driver.sh@64 -- # (( fail == 0 )) 00:04:18.536 20:00:02 -- setup/driver.sh@65 -- # setup reset 00:04:18.536 20:00:02 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:18.536 20:00:02 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:26.649 00:04:26.649 real 0m15.389s 00:04:26.649 user 0m3.603s 00:04:26.649 sys 0m7.882s 00:04:26.649 20:00:09 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:26.649 20:00:09 -- common/autotest_common.sh@10 -- # set +x 00:04:26.649 ************************************ 00:04:26.649 END TEST guess_driver 00:04:26.649 ************************************ 00:04:26.649 00:04:26.649 real 0m22.519s 00:04:26.649 user 0m5.674s 00:04:26.649 sys 0m12.125s 00:04:26.649 20:00:09 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:26.649 20:00:09 -- common/autotest_common.sh@10 -- # set +x 00:04:26.649 ************************************ 00:04:26.649 END TEST driver 00:04:26.649 ************************************ 00:04:26.649 20:00:09 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:26.649 20:00:09 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:26.649 20:00:09 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:26.649 20:00:09 -- common/autotest_common.sh@10 -- # set +x 00:04:26.649 ************************************ 00:04:26.649 START TEST devices 00:04:26.649 ************************************ 00:04:26.649 20:00:09 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh 00:04:26.649 * Looking for test storage... 
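The guess_driver run that finishes above reduces to: vfio-pci is usable when IOMMU groups exist (or unsafe no-IOMMU mode is enabled) and modprobe can resolve vfio_pci's dependency chain to real modules. A rough standalone sketch of that decision, not the exact driver.sh code:

#!/usr/bin/env bash
# Sketch of the vfio-pci suitability check traced above.
shopt -s nullglob
unsafe_vfio=N
if [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]; then
    unsafe_vfio=$(cat /sys/module/vfio/parameters/enable_unsafe_noiommu_mode)
fi
iommu_groups=(/sys/kernel/iommu_groups/*)
if { (( ${#iommu_groups[@]} > 0 )) || [[ $unsafe_vfio == [Yy] ]]; } &&
   modprobe --show-depends vfio_pci | grep -q '\.ko'; then
    echo 'Looking for driver=vfio-pci'
else
    echo 'No valid driver found' >&2
fi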
00:04:26.649 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:04:26.649 20:00:10 -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:26.649 20:00:10 -- setup/devices.sh@192 -- # setup reset 00:04:26.649 20:00:10 -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:26.649 20:00:10 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:04:31.919 20:00:15 -- setup/devices.sh@194 -- # get_zoned_devs 00:04:31.919 20:00:15 -- common/autotest_common.sh@1665 -- # zoned_devs=() 00:04:31.919 20:00:15 -- common/autotest_common.sh@1665 -- # local -gA zoned_devs 00:04:31.919 20:00:15 -- common/autotest_common.sh@1666 -- # local nvme bdf 00:04:31.919 20:00:15 -- common/autotest_common.sh@1668 -- # for nvme in /sys/block/nvme* 00:04:31.920 20:00:15 -- common/autotest_common.sh@1669 -- # is_block_zoned nvme0n1 00:04:31.920 20:00:15 -- common/autotest_common.sh@1658 -- # local device=nvme0n1 00:04:31.920 20:00:15 -- common/autotest_common.sh@1660 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:31.920 20:00:15 -- common/autotest_common.sh@1661 -- # [[ none != none ]] 00:04:31.920 20:00:15 -- setup/devices.sh@196 -- # blocks=() 00:04:31.920 20:00:15 -- setup/devices.sh@196 -- # declare -a blocks 00:04:31.920 20:00:15 -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:31.920 20:00:15 -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:31.920 20:00:15 -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:31.920 20:00:15 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:31.920 20:00:15 -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:31.920 20:00:15 -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:31.920 20:00:15 -- setup/devices.sh@202 -- # pci=0000:1a:00.0 00:04:31.920 20:00:15 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\1\a\:\0\0\.\0* ]] 00:04:31.920 20:00:15 -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:04:31.920 20:00:15 -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:04:31.920 20:00:15 -- scripts/common.sh@387 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:04:31.920 No valid GPT data, bailing 00:04:31.920 20:00:15 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:31.920 20:00:15 -- scripts/common.sh@391 -- # pt= 00:04:31.920 20:00:15 -- scripts/common.sh@392 -- # return 1 00:04:31.920 20:00:15 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:04:31.920 20:00:15 -- setup/common.sh@76 -- # local dev=nvme0n1 00:04:31.920 20:00:15 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:04:31.920 20:00:15 -- setup/common.sh@80 -- # echo 4000787030016 00:04:31.920 20:00:15 -- setup/devices.sh@204 -- # (( 4000787030016 >= min_disk_size )) 00:04:31.920 20:00:15 -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:31.920 20:00:15 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:1a:00.0 00:04:31.920 20:00:15 -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:31.920 20:00:15 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:04:31.920 20:00:15 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:31.920 20:00:15 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:31.920 20:00:15 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:31.920 20:00:15 -- common/autotest_common.sh@10 -- # set +x 00:04:31.920 ************************************ 00:04:31.920 START TEST nvme_mount 00:04:31.920 ************************************ 00:04:31.920 
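Device selection just above keeps nvme0n1 because it is not zoned and its 4000787030016-byte capacity clears the 3221225472-byte minimum. The same two checks can be made directly against sysfs; a sketch (the size-via-sectors route is the standard kernel interface, not necessarily the helper's exact method):

#!/usr/bin/env bash
# Sketch: accept a block device only if it is not zoned and is at least min_disk_size bytes.
dev=nvme0n1                  # the device kept by the run above
min_disk_size=3221225472     # same threshold as the trace
zoned=$(cat /sys/block/$dev/queue/zoned)
size_bytes=$(( $(cat /sys/block/$dev/size) * 512 ))   # /sys/block/*/size counts 512-byte sectors
if [[ $zoned == none ]] && (( size_bytes >= min_disk_size )); then
    echo "$dev is usable ($size_bytes bytes)"
fi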
20:00:15 -- common/autotest_common.sh@1121 -- # nvme_mount 00:04:31.920 20:00:15 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:04:31.920 20:00:15 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:04:31.920 20:00:15 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:31.920 20:00:15 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:31.920 20:00:15 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:04:31.920 20:00:15 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:31.920 20:00:15 -- setup/common.sh@40 -- # local part_no=1 00:04:31.920 20:00:15 -- setup/common.sh@41 -- # local size=1073741824 00:04:31.920 20:00:15 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:31.920 20:00:15 -- setup/common.sh@44 -- # parts=() 00:04:31.920 20:00:15 -- setup/common.sh@44 -- # local parts 00:04:31.920 20:00:15 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:31.920 20:00:15 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:31.920 20:00:15 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:31.920 20:00:15 -- setup/common.sh@46 -- # (( part++ )) 00:04:31.920 20:00:15 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:31.920 20:00:15 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:31.920 20:00:15 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:31.920 20:00:15 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:04:32.862 Creating new GPT entries in memory. 00:04:32.862 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:32.862 other utilities. 00:04:32.862 20:00:17 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:32.862 20:00:17 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:32.862 20:00:17 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:32.862 20:00:17 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:32.862 20:00:17 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:33.803 Creating new GPT entries in memory. 00:04:33.803 The operation has completed successfully. 
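The 'Creating new GPT entries in memory' / 'operation has completed successfully' lines above are sgdisk output: the label is zapped, then a single 1 GiB partition (sectors 2048-2099199) is created while holding a lock on the device, with sync_dev_uevents.sh waiting for the matching udev events. Condensed, the sequence is:

#!/usr/bin/env bash
# Sketch of the partitioning step traced above.
disk=/dev/nvme0n1
sgdisk "$disk" --zap-all                             # wipe any existing GPT/MBR structures
flock "$disk" sgdisk "$disk" --new=1:2048:2099199    # one 1 GiB partition, under an exclusive lock
# the test waits for the partition uevent via scripts/sync_dev_uevents.sh;
# partprobe "$disk" is a generic alternative outside the test harness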
00:04:33.803 20:00:18 -- setup/common.sh@57 -- # (( part++ )) 00:04:33.803 20:00:18 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:33.803 20:00:18 -- setup/common.sh@62 -- # wait 1583912 00:04:33.803 20:00:18 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:33.803 20:00:18 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:33.803 20:00:18 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:33.803 20:00:18 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:04:33.803 20:00:18 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:04:33.803 20:00:18 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:33.803 20:00:18 -- setup/devices.sh@105 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:33.803 20:00:18 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:33.803 20:00:18 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:04:33.803 20:00:18 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:33.803 20:00:18 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:33.803 20:00:18 -- setup/devices.sh@53 -- # local found=0 00:04:33.803 20:00:18 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:33.803 20:00:18 -- setup/devices.sh@56 -- # : 00:04:33.803 20:00:18 -- setup/devices.sh@59 -- # local pci status 00:04:33.803 20:00:18 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:33.803 20:00:18 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:33.804 20:00:18 -- setup/devices.sh@47 -- # setup output config 00:04:33.804 20:00:18 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:33.804 20:00:18 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:04:38.155 20:00:21 -- setup/devices.sh@63 -- # found=1 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 
]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:38.155 20:00:21 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:38.155 20:00:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.532 20:00:23 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:39.532 20:00:23 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:39.532 20:00:23 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.532 20:00:23 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:39.532 20:00:23 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:39.532 20:00:23 -- setup/devices.sh@110 -- # cleanup_nvme 00:04:39.532 20:00:23 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.532 20:00:23 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.532 20:00:23 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:39.532 20:00:23 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:39.532 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:39.532 20:00:23 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:39.532 20:00:23 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:39.795 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:39.795 /dev/nvme0n1: 8 bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:39.795 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe 
(PMBR): 55 aa 00:04:39.795 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:39.795 20:00:24 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:39.795 20:00:24 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:39.795 20:00:24 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.796 20:00:24 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:04:39.796 20:00:24 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:04:39.796 20:00:24 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.796 20:00:24 -- setup/devices.sh@116 -- # verify 0000:1a:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:39.796 20:00:24 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:39.796 20:00:24 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:04:39.796 20:00:24 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:39.796 20:00:24 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:39.796 20:00:24 -- setup/devices.sh@53 -- # local found=0 00:04:39.796 20:00:24 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:39.796 20:00:24 -- setup/devices.sh@56 -- # : 00:04:39.796 20:00:24 -- setup/devices.sh@59 -- # local pci status 00:04:39.796 20:00:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:39.796 20:00:24 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:39.796 20:00:24 -- setup/devices.sh@47 -- # setup output config 00:04:39.796 20:00:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:39.796 20:00:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:04:43.115 20:00:27 -- setup/devices.sh@63 -- # found=1 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- 
setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:43.115 20:00:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:43.115 20:00:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.018 20:00:29 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:45.018 20:00:29 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:45.018 20:00:29 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.018 20:00:29 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:45.018 20:00:29 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:45.018 20:00:29 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:45.018 20:00:29 -- setup/devices.sh@125 -- # verify 0000:1a:00.0 data@nvme0n1 '' '' 00:04:45.018 20:00:29 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:45.018 20:00:29 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:04:45.018 20:00:29 -- setup/devices.sh@50 -- # local mount_point= 00:04:45.018 20:00:29 -- setup/devices.sh@51 -- # local test_file= 00:04:45.018 20:00:29 -- setup/devices.sh@53 -- # local found=0 00:04:45.018 20:00:29 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:45.018 20:00:29 -- setup/devices.sh@59 -- # local pci status 00:04:45.018 20:00:29 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:45.018 20:00:29 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:45.018 20:00:29 -- setup/devices.sh@47 -- # setup output config 00:04:45.018 20:00:29 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:45.018 20:00:29 -- setup/common.sh@10 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:04:49.207 20:00:32 -- setup/devices.sh@63 -- # found=1 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.207 20:00:32 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:49.207 20:00:32 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:50.584 20:00:34 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:50.584 20:00:34 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:50.584 20:00:34 -- setup/devices.sh@68 -- # return 0 00:04:50.584 20:00:34 -- setup/devices.sh@128 -- # cleanup_nvme 00:04:50.584 20:00:34 -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:50.584 20:00:34 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:50.584 20:00:34 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:50.584 20:00:34 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:50.584 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:50.584 00:04:50.584 real 0m18.912s 00:04:50.584 user 0m5.758s 00:04:50.584 sys 0m10.924s 00:04:50.584 20:00:34 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:04:50.584 20:00:34 -- common/autotest_common.sh@10 -- # set +x 00:04:50.584 ************************************ 00:04:50.584 END TEST nvme_mount 00:04:50.584 ************************************ 00:04:50.584 20:00:34 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:50.584 20:00:34 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:04:50.584 20:00:34 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:04:50.584 20:00:34 -- common/autotest_common.sh@10 -- # set +x 00:04:50.850 ************************************ 00:04:50.850 START TEST dm_mount 00:04:50.850 ************************************ 00:04:50.850 20:00:35 -- common/autotest_common.sh@1121 -- # dm_mount 00:04:50.850 20:00:35 -- setup/devices.sh@144 -- # pv=nvme0n1 00:04:50.850 20:00:35 -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:04:50.850 20:00:35 -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:04:50.850 20:00:35 -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:04:50.850 20:00:35 -- setup/common.sh@39 -- # local disk=nvme0n1 00:04:50.850 20:00:35 -- setup/common.sh@40 -- # local part_no=2 00:04:50.850 20:00:35 -- setup/common.sh@41 -- # local size=1073741824 00:04:50.850 20:00:35 -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:50.850 20:00:35 -- setup/common.sh@44 -- # parts=() 00:04:50.850 20:00:35 -- setup/common.sh@44 -- # local parts 00:04:50.850 20:00:35 -- setup/common.sh@46 -- # (( part = 1 )) 00:04:50.850 20:00:35 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:50.850 20:00:35 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:50.850 20:00:35 -- setup/common.sh@46 -- # (( part++ )) 00:04:50.850 20:00:35 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:50.850 20:00:35 -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:50.850 20:00:35 -- setup/common.sh@46 -- # (( part++ )) 00:04:50.850 20:00:35 -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:50.850 20:00:35 -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:50.850 20:00:35 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:04:50.850 20:00:35 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:04:51.788 Creating new GPT entries in memory. 00:04:51.788 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:51.788 other utilities. 00:04:51.788 20:00:36 -- setup/common.sh@57 -- # (( part = 1 )) 00:04:51.788 20:00:36 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:51.788 20:00:36 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:51.788 20:00:36 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:51.788 20:00:36 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:04:52.725 Creating new GPT entries in memory. 00:04:52.725 The operation has completed successfully. 
00:04:52.725 20:00:37 -- setup/common.sh@57 -- # (( part++ )) 00:04:52.725 20:00:37 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:52.725 20:00:37 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:52.725 20:00:37 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:52.725 20:00:37 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:54.105 The operation has completed successfully. 00:04:54.105 20:00:38 -- setup/common.sh@57 -- # (( part++ )) 00:04:54.105 20:00:38 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:54.105 20:00:38 -- setup/common.sh@62 -- # wait 1589110 00:04:54.105 20:00:38 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:54.105 20:00:38 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:54.105 20:00:38 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:54.105 20:00:38 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:54.105 20:00:38 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:54.105 20:00:38 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:54.105 20:00:38 -- setup/devices.sh@161 -- # break 00:04:54.105 20:00:38 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:54.105 20:00:38 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:54.105 20:00:38 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:54.105 20:00:38 -- setup/devices.sh@166 -- # dm=dm-0 00:04:54.105 20:00:38 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:54.105 20:00:38 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:54.105 20:00:38 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:54.105 20:00:38 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:54.105 20:00:38 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:54.105 20:00:38 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:54.105 20:00:38 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:54.105 20:00:38 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:54.105 20:00:38 -- setup/devices.sh@174 -- # verify 0000:1a:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:54.105 20:00:38 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:54.105 20:00:38 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:54.105 20:00:38 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:54.105 20:00:38 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:54.105 20:00:38 -- setup/devices.sh@53 -- # local found=0 00:04:54.105 20:00:38 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:54.105 20:00:38 -- setup/devices.sh@56 -- # : 
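The dmsetup create above produces /dev/mapper/nvme_dm_test (resolving to dm-0), which then gets mkfs.ext4 -qF and a mount. The table itself is not shown in the trace; the sketch below assumes a plain linear concatenation of the two 1 GiB partitions, which is consistent with the holders/dm-0 links checked for both nvme0n1p1 and nvme0n1p2:

#!/usr/bin/env bash
# Sketch: join the two test partitions into one linear device-mapper target.
p1=/dev/nvme0n1p1
p2=/dev/nvme0n1p2
s1=$(blockdev --getsz "$p1")   # sizes in 512-byte sectors
s2=$(blockdev --getsz "$p2")
dmsetup create nvme_dm_test <<EOF
0 $s1 linear $p1 0
$s1 $s2 linear $p2 0
EOF
mkfs.ext4 -qF /dev/mapper/nvme_dm_test
mount /dev/mapper/nvme_dm_test /mnt   # the test mounts under its own dm_mount directory instead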
00:04:54.105 20:00:38 -- setup/devices.sh@59 -- # local pci status 00:04:54.105 20:00:38 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:54.105 20:00:38 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:54.105 20:00:38 -- setup/devices.sh@47 -- # setup output config 00:04:54.105 20:00:38 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:54.105 20:00:38 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:57.393 20:00:41 -- setup/devices.sh@63 -- # found=1 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.393 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.393 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.394 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.394 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.394 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.394 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.394 20:00:41 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.394 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:57.394 20:00:41 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:04:57.394 20:00:41 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.926 20:00:43 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:59.926 20:00:43 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:59.926 20:00:43 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:59.926 20:00:43 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:59.926 20:00:43 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:59.926 20:00:43 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:59.926 20:00:43 -- setup/devices.sh@184 -- # verify 0000:1a:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:59.926 20:00:43 -- setup/devices.sh@48 -- # local dev=0000:1a:00.0 00:04:59.926 20:00:43 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:59.926 20:00:43 -- setup/devices.sh@50 -- # local mount_point= 00:04:59.926 20:00:43 -- setup/devices.sh@51 -- # local test_file= 00:04:59.926 20:00:43 -- setup/devices.sh@53 -- # local found=0 00:04:59.926 20:00:43 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:59.926 20:00:43 -- setup/devices.sh@59 -- # local pci status 00:04:59.926 20:00:43 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:59.926 20:00:43 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:1a:00.0 00:04:59.926 20:00:43 -- setup/devices.sh@47 -- # setup output config 00:04:59.926 20:00:43 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.926 20:00:43 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:1a:00.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:03.216 20:00:47 -- setup/devices.sh@63 -- # found=1 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:03.216 20:00:47 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\1\a\:\0\0\.\0 ]] 00:05:03.216 20:00:47 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:05.121 20:00:49 -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:05.121 20:00:49 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:05.121 20:00:49 -- setup/devices.sh@68 -- # return 0 00:05:05.121 20:00:49 -- setup/devices.sh@187 -- # cleanup_dm 00:05:05.121 20:00:49 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:05.121 20:00:49 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:05.121 20:00:49 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:05.121 20:00:49 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:05.121 20:00:49 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:05.121 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:05.121 20:00:49 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:05.121 20:00:49 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:05.121 00:05:05.121 real 0m14.290s 00:05:05.121 user 0m3.934s 00:05:05.121 sys 0m7.382s 00:05:05.121 20:00:49 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:05.121 20:00:49 -- common/autotest_common.sh@10 -- # set +x 00:05:05.121 ************************************ 00:05:05.121 END TEST dm_mount 00:05:05.121 ************************************ 00:05:05.121 20:00:49 -- setup/devices.sh@1 -- # cleanup 00:05:05.121 20:00:49 -- setup/devices.sh@11 -- # cleanup_nvme 00:05:05.121 20:00:49 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:05:05.121 20:00:49 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:05.121 20:00:49 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:05.121 20:00:49 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:05.121 20:00:49 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:05.380 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:05.380 /dev/nvme0n1: 8 
bytes were erased at offset 0x3a3817d5e00 (gpt): 45 46 49 20 50 41 52 54 00:05:05.380 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:05.380 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:05.380 20:00:49 -- setup/devices.sh@12 -- # cleanup_dm 00:05:05.380 20:00:49 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:05:05.380 20:00:49 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:05.380 20:00:49 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:05.380 20:00:49 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:05.380 20:00:49 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:05.380 20:00:49 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:05.380 00:05:05.380 real 0m39.773s 00:05:05.380 user 0m11.773s 00:05:05.380 sys 0m22.598s 00:05:05.380 20:00:49 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:05.380 20:00:49 -- common/autotest_common.sh@10 -- # set +x 00:05:05.380 ************************************ 00:05:05.380 END TEST devices 00:05:05.380 ************************************ 00:05:05.380 00:05:05.380 real 2m23.575s 00:05:05.380 user 0m42.370s 00:05:05.380 sys 1m23.574s 00:05:05.380 20:00:49 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:05.380 20:00:49 -- common/autotest_common.sh@10 -- # set +x 00:05:05.380 ************************************ 00:05:05.380 END TEST setup.sh 00:05:05.380 ************************************ 00:05:05.380 20:00:49 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:05:08.671 Hugepages 00:05:08.671 node hugesize free / total 00:05:08.671 node0 1048576kB 0 / 0 00:05:08.944 node0 2048kB 2048 / 2048 00:05:08.944 node1 1048576kB 0 / 0 00:05:08.944 node1 2048kB 0 / 0 00:05:08.944 00:05:08.944 Type BDF Vendor Device NUMA Driver Device Block devices 00:05:08.944 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:05:08.944 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:05:08.944 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:05:08.945 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:05:08.945 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:05:08.945 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:05:08.945 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:05:08.945 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:05:08.945 NVMe 0000:1a:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:05:08.945 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:05:08.945 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:05:08.945 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:05:08.945 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:05:08.945 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:05:08.945 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:05:08.945 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:05:08.945 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:05:08.945 20:00:53 -- spdk/autotest.sh@130 -- # uname -s 00:05:08.945 20:00:53 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:05:08.945 20:00:53 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:05:08.945 20:00:53 -- common/autotest_common.sh@1527 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:13.144 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
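The 'ioatdma -> vfio-pci' messages here are setup.sh moving the I/OAT engines (and then the NVMe controller) onto vfio-pci. Mechanically that is the usual sysfs unbind/driver_override/bind sequence; a generic sketch for one function (the BDF is an example, and this is not the repo script itself):

#!/usr/bin/env bash
# Generic sketch of rebinding one PCI function to vfio-pci via sysfs.
bdf=0000:00:04.0
modprobe vfio-pci
if [[ -e /sys/bus/pci/devices/$bdf/driver ]]; then
    echo "$bdf" > /sys/bus/pci/devices/$bdf/driver/unbind
fi
echo vfio-pci > /sys/bus/pci/devices/$bdf/driver_override
echo "$bdf" > /sys/bus/pci/drivers_probe
echo "" > /sys/bus/pci/devices/$bdf/driver_override   # clear the override once bound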
00:05:13.144 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:13.144 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:16.444 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:05:18.381 20:01:02 -- common/autotest_common.sh@1528 -- # sleep 1 00:05:19.320 20:01:03 -- common/autotest_common.sh@1529 -- # bdfs=() 00:05:19.320 20:01:03 -- common/autotest_common.sh@1529 -- # local bdfs 00:05:19.320 20:01:03 -- common/autotest_common.sh@1530 -- # bdfs=($(get_nvme_bdfs)) 00:05:19.320 20:01:03 -- common/autotest_common.sh@1530 -- # get_nvme_bdfs 00:05:19.320 20:01:03 -- common/autotest_common.sh@1509 -- # bdfs=() 00:05:19.320 20:01:03 -- common/autotest_common.sh@1509 -- # local bdfs 00:05:19.320 20:01:03 -- common/autotest_common.sh@1510 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:19.320 20:01:03 -- common/autotest_common.sh@1510 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:19.320 20:01:03 -- common/autotest_common.sh@1510 -- # jq -r '.config[].params.traddr' 00:05:19.320 20:01:03 -- common/autotest_common.sh@1511 -- # (( 1 == 0 )) 00:05:19.320 20:01:03 -- common/autotest_common.sh@1515 -- # printf '%s\n' 0000:1a:00.0 00:05:19.320 20:01:03 -- common/autotest_common.sh@1532 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:22.611 Waiting for block devices as requested 00:05:22.611 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:05:22.869 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:22.869 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:22.869 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:23.129 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:23.129 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:23.129 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:23.129 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:23.389 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:23.389 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:23.389 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:23.648 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:23.648 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:23.648 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:23.908 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:23.908 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:23.908 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:25.810 20:01:10 -- common/autotest_common.sh@1534 -- # for bdf in "${bdfs[@]}" 00:05:25.810 20:01:10 -- common/autotest_common.sh@1535 -- # get_nvme_ctrlr_from_bdf 0000:1a:00.0 00:05:25.810 20:01:10 -- common/autotest_common.sh@1498 -- # readlink -f /sys/class/nvme/nvme0 00:05:25.810 20:01:10 -- common/autotest_common.sh@1498 -- # grep 0000:1a:00.0/nvme/nvme 00:05:25.810 20:01:10 -- common/autotest_common.sh@1498 -- # bdf_sysfs_path=/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:05:25.810 20:01:10 -- common/autotest_common.sh@1499 -- # [[ -z 
/sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 ]] 00:05:25.810 20:01:10 -- common/autotest_common.sh@1503 -- # basename /sys/devices/pci0000:17/0000:17:00.0/0000:18:00.0/0000:19:00.0/0000:1a:00.0/nvme/nvme0 00:05:25.810 20:01:10 -- common/autotest_common.sh@1503 -- # printf '%s\n' nvme0 00:05:25.810 20:01:10 -- common/autotest_common.sh@1535 -- # nvme_ctrlr=/dev/nvme0 00:05:25.810 20:01:10 -- common/autotest_common.sh@1536 -- # [[ -z /dev/nvme0 ]] 00:05:25.810 20:01:10 -- common/autotest_common.sh@1541 -- # nvme id-ctrl /dev/nvme0 00:05:25.810 20:01:10 -- common/autotest_common.sh@1541 -- # grep oacs 00:05:25.810 20:01:10 -- common/autotest_common.sh@1541 -- # cut -d: -f2 00:05:26.069 20:01:10 -- common/autotest_common.sh@1541 -- # oacs=' 0xe' 00:05:26.069 20:01:10 -- common/autotest_common.sh@1542 -- # oacs_ns_manage=8 00:05:26.069 20:01:10 -- common/autotest_common.sh@1544 -- # [[ 8 -ne 0 ]] 00:05:26.069 20:01:10 -- common/autotest_common.sh@1550 -- # grep unvmcap 00:05:26.069 20:01:10 -- common/autotest_common.sh@1550 -- # nvme id-ctrl /dev/nvme0 00:05:26.069 20:01:10 -- common/autotest_common.sh@1550 -- # cut -d: -f2 00:05:26.069 20:01:10 -- common/autotest_common.sh@1550 -- # unvmcap=' 0' 00:05:26.069 20:01:10 -- common/autotest_common.sh@1551 -- # [[ 0 -eq 0 ]] 00:05:26.069 20:01:10 -- common/autotest_common.sh@1553 -- # continue 00:05:26.069 20:01:10 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:05:26.069 20:01:10 -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:26.069 20:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:26.069 20:01:10 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:05:26.069 20:01:10 -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:26.069 20:01:10 -- common/autotest_common.sh@10 -- # set +x 00:05:26.069 20:01:10 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:30.256 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:30.257 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:33.544 0000:1a:00.0 (8086 0a54): nvme -> vfio-pci 00:05:34.919 20:01:19 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:05:34.919 20:01:19 -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:34.919 20:01:19 -- common/autotest_common.sh@10 -- # set +x 00:05:34.919 20:01:19 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:05:34.919 20:01:19 -- common/autotest_common.sh@1587 -- # mapfile -t bdfs 00:05:34.919 20:01:19 -- common/autotest_common.sh@1587 -- # get_nvme_bdfs_by_id 0x0a54 00:05:34.919 20:01:19 -- common/autotest_common.sh@1573 -- # bdfs=() 00:05:34.919 20:01:19 -- common/autotest_common.sh@1573 -- # local bdfs 00:05:34.919 
20:01:19 -- common/autotest_common.sh@1575 -- # get_nvme_bdfs 00:05:34.919 20:01:19 -- common/autotest_common.sh@1509 -- # bdfs=() 00:05:34.919 20:01:19 -- common/autotest_common.sh@1509 -- # local bdfs 00:05:34.919 20:01:19 -- common/autotest_common.sh@1510 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:34.919 20:01:19 -- common/autotest_common.sh@1510 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:34.919 20:01:19 -- common/autotest_common.sh@1510 -- # jq -r '.config[].params.traddr' 00:05:35.178 20:01:19 -- common/autotest_common.sh@1511 -- # (( 1 == 0 )) 00:05:35.178 20:01:19 -- common/autotest_common.sh@1515 -- # printf '%s\n' 0000:1a:00.0 00:05:35.178 20:01:19 -- common/autotest_common.sh@1575 -- # for bdf in $(get_nvme_bdfs) 00:05:35.178 20:01:19 -- common/autotest_common.sh@1576 -- # cat /sys/bus/pci/devices/0000:1a:00.0/device 00:05:35.178 20:01:19 -- common/autotest_common.sh@1576 -- # device=0x0a54 00:05:35.178 20:01:19 -- common/autotest_common.sh@1577 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:35.178 20:01:19 -- common/autotest_common.sh@1578 -- # bdfs+=($bdf) 00:05:35.178 20:01:19 -- common/autotest_common.sh@1582 -- # printf '%s\n' 0000:1a:00.0 00:05:35.178 20:01:19 -- common/autotest_common.sh@1588 -- # [[ -z 0000:1a:00.0 ]] 00:05:35.178 20:01:19 -- common/autotest_common.sh@1593 -- # spdk_tgt_pid=1599957 00:05:35.178 20:01:19 -- common/autotest_common.sh@1594 -- # waitforlisten 1599957 00:05:35.178 20:01:19 -- common/autotest_common.sh@1592 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:35.178 20:01:19 -- common/autotest_common.sh@827 -- # '[' -z 1599957 ']' 00:05:35.178 20:01:19 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:35.178 20:01:19 -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:35.178 20:01:19 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:35.178 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:35.178 20:01:19 -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:35.178 20:01:19 -- common/autotest_common.sh@10 -- # set +x 00:05:35.178 [2024-04-26 20:01:19.469120] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:05:35.178 [2024-04-26 20:01:19.469196] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1599957 ] 00:05:35.178 EAL: No free 2048 kB hugepages reported on node 1 00:05:35.178 [2024-04-26 20:01:19.555467] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:35.436 [2024-04-26 20:01:19.641096] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:36.005 20:01:20 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:36.005 20:01:20 -- common/autotest_common.sh@860 -- # return 0 00:05:36.005 20:01:20 -- common/autotest_common.sh@1596 -- # bdf_id=0 00:05:36.005 20:01:20 -- common/autotest_common.sh@1597 -- # for bdf in "${bdfs[@]}" 00:05:36.005 20:01:20 -- common/autotest_common.sh@1598 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:1a:00.0 00:05:39.301 nvme0n1 00:05:39.301 20:01:23 -- common/autotest_common.sh@1600 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:39.301 [2024-04-26 20:01:23.483556] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:39.301 request: 00:05:39.301 { 00:05:39.301 "nvme_ctrlr_name": "nvme0", 00:05:39.301 "password": "test", 00:05:39.301 "method": "bdev_nvme_opal_revert", 00:05:39.301 "req_id": 1 00:05:39.301 } 00:05:39.301 Got JSON-RPC error response 00:05:39.301 response: 00:05:39.301 { 00:05:39.301 "code": -32602, 00:05:39.301 "message": "Invalid parameters" 00:05:39.301 } 00:05:39.301 20:01:23 -- common/autotest_common.sh@1600 -- # true 00:05:39.301 20:01:23 -- common/autotest_common.sh@1601 -- # (( ++bdf_id )) 00:05:39.301 20:01:23 -- common/autotest_common.sh@1604 -- # killprocess 1599957 00:05:39.301 20:01:23 -- common/autotest_common.sh@946 -- # '[' -z 1599957 ']' 00:05:39.301 20:01:23 -- common/autotest_common.sh@950 -- # kill -0 1599957 00:05:39.301 20:01:23 -- common/autotest_common.sh@951 -- # uname 00:05:39.301 20:01:23 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:39.301 20:01:23 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1599957 00:05:39.301 20:01:23 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:39.301 20:01:23 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:39.301 20:01:23 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1599957' 00:05:39.301 killing process with pid 1599957 00:05:39.301 20:01:23 -- common/autotest_common.sh@965 -- # kill 1599957 00:05:39.301 20:01:23 -- common/autotest_common.sh@970 -- # wait 1599957 00:05:43.501 20:01:27 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:05:43.501 20:01:27 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:05:43.501 20:01:27 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]] 00:05:43.501 20:01:27 -- spdk/autotest.sh@155 -- # [[ 0 -eq 1 ]] 00:05:43.501 20:01:27 -- spdk/autotest.sh@162 -- # timing_enter lib 00:05:43.501 20:01:27 -- common/autotest_common.sh@720 -- # xtrace_disable 00:05:43.501 20:01:27 -- common/autotest_common.sh@10 -- # set +x 00:05:43.501 20:01:27 -- spdk/autotest.sh@164 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:43.501 20:01:27 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:43.501 20:01:27 -- common/autotest_common.sh@1103 -- # xtrace_disable 
00:05:43.501 20:01:27 -- common/autotest_common.sh@10 -- # set +x 00:05:43.501 ************************************ 00:05:43.501 START TEST env 00:05:43.501 ************************************ 00:05:43.501 20:01:27 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:43.501 * Looking for test storage... 00:05:43.501 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:43.501 20:01:27 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:43.501 20:01:27 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:43.501 20:01:27 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:43.501 20:01:27 -- common/autotest_common.sh@10 -- # set +x 00:05:43.501 ************************************ 00:05:43.501 START TEST env_memory 00:05:43.501 ************************************ 00:05:43.501 20:01:27 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:43.501 00:05:43.501 00:05:43.501 CUnit - A unit testing framework for C - Version 2.1-3 00:05:43.501 http://cunit.sourceforge.net/ 00:05:43.501 00:05:43.501 00:05:43.501 Suite: memory 00:05:43.761 Test: alloc and free memory map ...[2024-04-26 20:01:27.961317] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:43.761 passed 00:05:43.761 Test: mem map translation ...[2024-04-26 20:01:27.974762] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:43.761 [2024-04-26 20:01:27.974780] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:43.761 [2024-04-26 20:01:27.974813] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:43.761 [2024-04-26 20:01:27.974822] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:43.761 passed 00:05:43.761 Test: mem map registration ...[2024-04-26 20:01:27.997276] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:43.761 [2024-04-26 20:01:27.997294] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:43.761 passed 00:05:43.761 Test: mem map adjacent registrations ...passed 00:05:43.761 00:05:43.761 Run Summary: Type Total Ran Passed Failed Inactive 00:05:43.761 suites 1 1 n/a 0 0 00:05:43.761 tests 4 4 4 0 0 00:05:43.761 asserts 152 152 152 0 n/a 00:05:43.761 00:05:43.761 Elapsed time = 0.088 seconds 00:05:43.761 00:05:43.761 real 0m0.101s 00:05:43.761 user 0m0.087s 00:05:43.761 sys 0m0.013s 00:05:43.761 20:01:28 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:43.761 20:01:28 -- common/autotest_common.sh@10 -- # set +x 00:05:43.761 ************************************ 00:05:43.761 END TEST env_memory 00:05:43.761 
************************************ 00:05:43.761 20:01:28 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:43.761 20:01:28 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:43.761 20:01:28 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:43.761 20:01:28 -- common/autotest_common.sh@10 -- # set +x 00:05:44.021 ************************************ 00:05:44.021 START TEST env_vtophys 00:05:44.021 ************************************ 00:05:44.021 20:01:28 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:44.021 EAL: lib.eal log level changed from notice to debug 00:05:44.021 EAL: Detected lcore 0 as core 0 on socket 0 00:05:44.021 EAL: Detected lcore 1 as core 1 on socket 0 00:05:44.021 EAL: Detected lcore 2 as core 2 on socket 0 00:05:44.021 EAL: Detected lcore 3 as core 3 on socket 0 00:05:44.021 EAL: Detected lcore 4 as core 4 on socket 0 00:05:44.021 EAL: Detected lcore 5 as core 8 on socket 0 00:05:44.021 EAL: Detected lcore 6 as core 9 on socket 0 00:05:44.021 EAL: Detected lcore 7 as core 10 on socket 0 00:05:44.021 EAL: Detected lcore 8 as core 11 on socket 0 00:05:44.021 EAL: Detected lcore 9 as core 16 on socket 0 00:05:44.021 EAL: Detected lcore 10 as core 17 on socket 0 00:05:44.021 EAL: Detected lcore 11 as core 18 on socket 0 00:05:44.021 EAL: Detected lcore 12 as core 19 on socket 0 00:05:44.021 EAL: Detected lcore 13 as core 20 on socket 0 00:05:44.021 EAL: Detected lcore 14 as core 24 on socket 0 00:05:44.021 EAL: Detected lcore 15 as core 25 on socket 0 00:05:44.021 EAL: Detected lcore 16 as core 26 on socket 0 00:05:44.021 EAL: Detected lcore 17 as core 27 on socket 0 00:05:44.021 EAL: Detected lcore 18 as core 0 on socket 1 00:05:44.021 EAL: Detected lcore 19 as core 1 on socket 1 00:05:44.021 EAL: Detected lcore 20 as core 2 on socket 1 00:05:44.021 EAL: Detected lcore 21 as core 3 on socket 1 00:05:44.021 EAL: Detected lcore 22 as core 4 on socket 1 00:05:44.021 EAL: Detected lcore 23 as core 8 on socket 1 00:05:44.021 EAL: Detected lcore 24 as core 9 on socket 1 00:05:44.021 EAL: Detected lcore 25 as core 10 on socket 1 00:05:44.021 EAL: Detected lcore 26 as core 11 on socket 1 00:05:44.021 EAL: Detected lcore 27 as core 16 on socket 1 00:05:44.021 EAL: Detected lcore 28 as core 17 on socket 1 00:05:44.021 EAL: Detected lcore 29 as core 18 on socket 1 00:05:44.021 EAL: Detected lcore 30 as core 19 on socket 1 00:05:44.021 EAL: Detected lcore 31 as core 20 on socket 1 00:05:44.021 EAL: Detected lcore 32 as core 24 on socket 1 00:05:44.021 EAL: Detected lcore 33 as core 25 on socket 1 00:05:44.021 EAL: Detected lcore 34 as core 26 on socket 1 00:05:44.021 EAL: Detected lcore 35 as core 27 on socket 1 00:05:44.021 EAL: Detected lcore 36 as core 0 on socket 0 00:05:44.021 EAL: Detected lcore 37 as core 1 on socket 0 00:05:44.021 EAL: Detected lcore 38 as core 2 on socket 0 00:05:44.021 EAL: Detected lcore 39 as core 3 on socket 0 00:05:44.021 EAL: Detected lcore 40 as core 4 on socket 0 00:05:44.021 EAL: Detected lcore 41 as core 8 on socket 0 00:05:44.021 EAL: Detected lcore 42 as core 9 on socket 0 00:05:44.021 EAL: Detected lcore 43 as core 10 on socket 0 00:05:44.021 EAL: Detected lcore 44 as core 11 on socket 0 00:05:44.021 EAL: Detected lcore 45 as core 16 on socket 0 00:05:44.021 EAL: Detected lcore 46 as core 17 on socket 0 00:05:44.021 EAL: Detected lcore 47 as core 18 on socket 0 00:05:44.021 
EAL: Detected lcore 48 as core 19 on socket 0 00:05:44.021 EAL: Detected lcore 49 as core 20 on socket 0 00:05:44.021 EAL: Detected lcore 50 as core 24 on socket 0 00:05:44.021 EAL: Detected lcore 51 as core 25 on socket 0 00:05:44.021 EAL: Detected lcore 52 as core 26 on socket 0 00:05:44.021 EAL: Detected lcore 53 as core 27 on socket 0 00:05:44.021 EAL: Detected lcore 54 as core 0 on socket 1 00:05:44.021 EAL: Detected lcore 55 as core 1 on socket 1 00:05:44.021 EAL: Detected lcore 56 as core 2 on socket 1 00:05:44.021 EAL: Detected lcore 57 as core 3 on socket 1 00:05:44.021 EAL: Detected lcore 58 as core 4 on socket 1 00:05:44.021 EAL: Detected lcore 59 as core 8 on socket 1 00:05:44.021 EAL: Detected lcore 60 as core 9 on socket 1 00:05:44.021 EAL: Detected lcore 61 as core 10 on socket 1 00:05:44.021 EAL: Detected lcore 62 as core 11 on socket 1 00:05:44.021 EAL: Detected lcore 63 as core 16 on socket 1 00:05:44.021 EAL: Detected lcore 64 as core 17 on socket 1 00:05:44.021 EAL: Detected lcore 65 as core 18 on socket 1 00:05:44.021 EAL: Detected lcore 66 as core 19 on socket 1 00:05:44.021 EAL: Detected lcore 67 as core 20 on socket 1 00:05:44.021 EAL: Detected lcore 68 as core 24 on socket 1 00:05:44.021 EAL: Detected lcore 69 as core 25 on socket 1 00:05:44.021 EAL: Detected lcore 70 as core 26 on socket 1 00:05:44.021 EAL: Detected lcore 71 as core 27 on socket 1 00:05:44.021 EAL: Maximum logical cores by configuration: 128 00:05:44.021 EAL: Detected CPU lcores: 72 00:05:44.021 EAL: Detected NUMA nodes: 2 00:05:44.021 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:44.021 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:44.021 EAL: Checking presence of .so 'librte_eal.so' 00:05:44.021 EAL: Detected static linkage of DPDK 00:05:44.021 EAL: No shared files mode enabled, IPC will be disabled 00:05:44.021 EAL: Bus pci wants IOVA as 'DC' 00:05:44.021 EAL: Buses did not request a specific IOVA mode. 00:05:44.021 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:44.022 EAL: Selected IOVA mode 'VA' 00:05:44.022 EAL: No free 2048 kB hugepages reported on node 1 00:05:44.022 EAL: Probing VFIO support... 00:05:44.022 EAL: IOMMU type 1 (Type 1) is supported 00:05:44.022 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:44.022 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:44.022 EAL: VFIO support initialized 00:05:44.022 EAL: Ask a virtual area of 0x2e000 bytes 00:05:44.022 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:44.022 EAL: Setting up physically contiguous memory... 
00:05:44.022 EAL: Setting maximum number of open files to 524288 00:05:44.022 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:44.022 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:44.022 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:44.022 EAL: Ask a virtual area of 0x61000 bytes 00:05:44.022 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:44.022 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:44.022 EAL: Ask a virtual area of 0x400000000 bytes 00:05:44.022 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:44.022 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:44.022 EAL: Ask a virtual area of 0x61000 bytes 00:05:44.022 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:44.022 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:44.022 EAL: Ask a virtual area of 0x400000000 bytes 00:05:44.022 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:44.022 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:44.022 EAL: Ask a virtual area of 0x61000 bytes 00:05:44.022 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:44.022 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:44.022 EAL: Ask a virtual area of 0x400000000 bytes 00:05:44.022 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:44.022 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:44.022 EAL: Ask a virtual area of 0x61000 bytes 00:05:44.022 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:44.022 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:44.022 EAL: Ask a virtual area of 0x400000000 bytes 00:05:44.022 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:44.022 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:44.022 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:44.022 EAL: Ask a virtual area of 0x61000 bytes 00:05:44.022 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:44.022 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:44.022 EAL: Ask a virtual area of 0x400000000 bytes 00:05:44.022 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:44.022 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:44.022 EAL: Ask a virtual area of 0x61000 bytes 00:05:44.022 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:44.022 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:44.022 EAL: Ask a virtual area of 0x400000000 bytes 00:05:44.022 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:44.022 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:44.022 EAL: Ask a virtual area of 0x61000 bytes 00:05:44.022 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:44.022 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:44.022 EAL: Ask a virtual area of 0x400000000 bytes 00:05:44.022 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:44.022 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:44.022 EAL: Ask a virtual area of 0x61000 bytes 00:05:44.022 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:44.022 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:44.022 EAL: Ask a virtual area of 0x400000000 bytes 00:05:44.022 EAL: Virtual area found 
at 0x201c01000000 (size = 0x400000000) 00:05:44.022 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:44.022 EAL: Hugepages will be freed exactly as allocated. 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: TSC frequency is ~2300000 KHz 00:05:44.022 EAL: Main lcore 0 is ready (tid=7f4344480a00;cpuset=[0]) 00:05:44.022 EAL: Trying to obtain current memory policy. 00:05:44.022 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.022 EAL: Restoring previous memory policy: 0 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was expanded by 2MB 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Mem event callback 'spdk:(nil)' registered 00:05:44.022 00:05:44.022 00:05:44.022 CUnit - A unit testing framework for C - Version 2.1-3 00:05:44.022 http://cunit.sourceforge.net/ 00:05:44.022 00:05:44.022 00:05:44.022 Suite: components_suite 00:05:44.022 Test: vtophys_malloc_test ...passed 00:05:44.022 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:44.022 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.022 EAL: Restoring previous memory policy: 4 00:05:44.022 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was expanded by 4MB 00:05:44.022 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was shrunk by 4MB 00:05:44.022 EAL: Trying to obtain current memory policy. 00:05:44.022 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.022 EAL: Restoring previous memory policy: 4 00:05:44.022 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was expanded by 6MB 00:05:44.022 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was shrunk by 6MB 00:05:44.022 EAL: Trying to obtain current memory policy. 00:05:44.022 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.022 EAL: Restoring previous memory policy: 4 00:05:44.022 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was expanded by 10MB 00:05:44.022 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was shrunk by 10MB 00:05:44.022 EAL: Trying to obtain current memory policy. 
00:05:44.022 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.022 EAL: Restoring previous memory policy: 4 00:05:44.022 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was expanded by 18MB 00:05:44.022 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.022 EAL: request: mp_malloc_sync 00:05:44.022 EAL: No shared files mode enabled, IPC is disabled 00:05:44.022 EAL: Heap on socket 0 was shrunk by 18MB 00:05:44.023 EAL: Trying to obtain current memory policy. 00:05:44.023 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.023 EAL: Restoring previous memory policy: 4 00:05:44.023 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.023 EAL: request: mp_malloc_sync 00:05:44.023 EAL: No shared files mode enabled, IPC is disabled 00:05:44.023 EAL: Heap on socket 0 was expanded by 34MB 00:05:44.023 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.023 EAL: request: mp_malloc_sync 00:05:44.023 EAL: No shared files mode enabled, IPC is disabled 00:05:44.023 EAL: Heap on socket 0 was shrunk by 34MB 00:05:44.023 EAL: Trying to obtain current memory policy. 00:05:44.023 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.023 EAL: Restoring previous memory policy: 4 00:05:44.023 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.023 EAL: request: mp_malloc_sync 00:05:44.023 EAL: No shared files mode enabled, IPC is disabled 00:05:44.023 EAL: Heap on socket 0 was expanded by 66MB 00:05:44.023 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.023 EAL: request: mp_malloc_sync 00:05:44.023 EAL: No shared files mode enabled, IPC is disabled 00:05:44.023 EAL: Heap on socket 0 was shrunk by 66MB 00:05:44.023 EAL: Trying to obtain current memory policy. 00:05:44.023 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.023 EAL: Restoring previous memory policy: 4 00:05:44.023 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.023 EAL: request: mp_malloc_sync 00:05:44.023 EAL: No shared files mode enabled, IPC is disabled 00:05:44.023 EAL: Heap on socket 0 was expanded by 130MB 00:05:44.023 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.282 EAL: request: mp_malloc_sync 00:05:44.282 EAL: No shared files mode enabled, IPC is disabled 00:05:44.282 EAL: Heap on socket 0 was shrunk by 130MB 00:05:44.282 EAL: Trying to obtain current memory policy. 00:05:44.282 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.282 EAL: Restoring previous memory policy: 4 00:05:44.282 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.282 EAL: request: mp_malloc_sync 00:05:44.282 EAL: No shared files mode enabled, IPC is disabled 00:05:44.282 EAL: Heap on socket 0 was expanded by 258MB 00:05:44.282 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.282 EAL: request: mp_malloc_sync 00:05:44.282 EAL: No shared files mode enabled, IPC is disabled 00:05:44.282 EAL: Heap on socket 0 was shrunk by 258MB 00:05:44.282 EAL: Trying to obtain current memory policy. 
00:05:44.282 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.541 EAL: Restoring previous memory policy: 4 00:05:44.541 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.541 EAL: request: mp_malloc_sync 00:05:44.541 EAL: No shared files mode enabled, IPC is disabled 00:05:44.541 EAL: Heap on socket 0 was expanded by 514MB 00:05:44.541 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.541 EAL: request: mp_malloc_sync 00:05:44.541 EAL: No shared files mode enabled, IPC is disabled 00:05:44.541 EAL: Heap on socket 0 was shrunk by 514MB 00:05:44.541 EAL: Trying to obtain current memory policy. 00:05:44.541 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:44.800 EAL: Restoring previous memory policy: 4 00:05:44.800 EAL: Calling mem event callback 'spdk:(nil)' 00:05:44.800 EAL: request: mp_malloc_sync 00:05:44.800 EAL: No shared files mode enabled, IPC is disabled 00:05:44.800 EAL: Heap on socket 0 was expanded by 1026MB 00:05:45.071 EAL: Calling mem event callback 'spdk:(nil)' 00:05:45.071 EAL: request: mp_malloc_sync 00:05:45.071 EAL: No shared files mode enabled, IPC is disabled 00:05:45.071 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:45.071 passed 00:05:45.071 00:05:45.071 Run Summary: Type Total Ran Passed Failed Inactive 00:05:45.071 suites 1 1 n/a 0 0 00:05:45.071 tests 2 2 2 0 0 00:05:45.071 asserts 497 497 497 0 n/a 00:05:45.071 00:05:45.071 Elapsed time = 1.109 seconds 00:05:45.071 EAL: Calling mem event callback 'spdk:(nil)' 00:05:45.071 EAL: request: mp_malloc_sync 00:05:45.071 EAL: No shared files mode enabled, IPC is disabled 00:05:45.071 EAL: Heap on socket 0 was shrunk by 2MB 00:05:45.071 EAL: No shared files mode enabled, IPC is disabled 00:05:45.071 EAL: No shared files mode enabled, IPC is disabled 00:05:45.071 EAL: No shared files mode enabled, IPC is disabled 00:05:45.071 00:05:45.071 real 0m1.250s 00:05:45.071 user 0m0.729s 00:05:45.071 sys 0m0.492s 00:05:45.071 20:01:29 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:45.071 20:01:29 -- common/autotest_common.sh@10 -- # set +x 00:05:45.071 ************************************ 00:05:45.071 END TEST env_vtophys 00:05:45.071 ************************************ 00:05:45.332 20:01:29 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:45.332 20:01:29 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:45.332 20:01:29 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:45.332 20:01:29 -- common/autotest_common.sh@10 -- # set +x 00:05:45.332 ************************************ 00:05:45.332 START TEST env_pci 00:05:45.332 ************************************ 00:05:45.332 20:01:29 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:45.332 00:05:45.332 00:05:45.332 CUnit - A unit testing framework for C - Version 2.1-3 00:05:45.332 http://cunit.sourceforge.net/ 00:05:45.332 00:05:45.332 00:05:45.332 Suite: pci 00:05:45.332 Test: pci_hook ...[2024-04-26 20:01:29.681763] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1601404 has claimed it 00:05:45.332 EAL: Cannot find device (10000:00:01.0) 00:05:45.332 EAL: Failed to attach device on primary process 00:05:45.332 passed 00:05:45.332 00:05:45.332 Run Summary: Type Total Ran Passed Failed Inactive 00:05:45.332 suites 1 1 n/a 0 0 00:05:45.332 tests 1 1 1 0 0 
00:05:45.332 asserts 25 25 25 0 n/a 00:05:45.332 00:05:45.332 Elapsed time = 0.038 seconds 00:05:45.332 00:05:45.332 real 0m0.056s 00:05:45.332 user 0m0.008s 00:05:45.332 sys 0m0.048s 00:05:45.332 20:01:29 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:45.332 20:01:29 -- common/autotest_common.sh@10 -- # set +x 00:05:45.332 ************************************ 00:05:45.332 END TEST env_pci 00:05:45.332 ************************************ 00:05:45.332 20:01:29 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:45.332 20:01:29 -- env/env.sh@15 -- # uname 00:05:45.332 20:01:29 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:45.332 20:01:29 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:45.332 20:01:29 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:45.332 20:01:29 -- common/autotest_common.sh@1097 -- # '[' 5 -le 1 ']' 00:05:45.332 20:01:29 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:45.332 20:01:29 -- common/autotest_common.sh@10 -- # set +x 00:05:45.601 ************************************ 00:05:45.601 START TEST env_dpdk_post_init 00:05:45.601 ************************************ 00:05:45.602 20:01:29 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:45.602 EAL: Detected CPU lcores: 72 00:05:45.602 EAL: Detected NUMA nodes: 2 00:05:45.602 EAL: Detected static linkage of DPDK 00:05:45.602 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:45.602 EAL: Selected IOVA mode 'VA' 00:05:45.602 EAL: No free 2048 kB hugepages reported on node 1 00:05:45.602 EAL: VFIO support initialized 00:05:45.602 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:45.863 EAL: Using IOMMU type 1 (Type 1) 00:05:46.433 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:1a:00.0 (socket 0) 00:05:51.732 EAL: Releasing PCI mapped resource for 0000:1a:00.0 00:05:51.732 EAL: Calling pci_unmap_resource for 0000:1a:00.0 at 0x202001000000 00:05:51.990 Starting DPDK initialization... 00:05:51.990 Starting SPDK post initialization... 00:05:51.990 SPDK NVMe probe 00:05:51.990 Attaching to 0000:1a:00.0 00:05:51.990 Attached to 0000:1a:00.0 00:05:51.990 Cleaning up... 
00:05:51.990 00:05:51.990 real 0m6.474s 00:05:51.990 user 0m4.946s 00:05:51.990 sys 0m0.780s 00:05:51.990 20:01:36 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:51.990 20:01:36 -- common/autotest_common.sh@10 -- # set +x 00:05:51.990 ************************************ 00:05:51.990 END TEST env_dpdk_post_init 00:05:51.990 ************************************ 00:05:51.990 20:01:36 -- env/env.sh@26 -- # uname 00:05:51.990 20:01:36 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:51.990 20:01:36 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:51.990 20:01:36 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:51.990 20:01:36 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:51.990 20:01:36 -- common/autotest_common.sh@10 -- # set +x 00:05:52.249 ************************************ 00:05:52.249 START TEST env_mem_callbacks 00:05:52.249 ************************************ 00:05:52.249 20:01:36 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:52.249 EAL: Detected CPU lcores: 72 00:05:52.249 EAL: Detected NUMA nodes: 2 00:05:52.249 EAL: Detected static linkage of DPDK 00:05:52.249 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:52.249 EAL: Selected IOVA mode 'VA' 00:05:52.249 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.249 EAL: VFIO support initialized 00:05:52.249 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:52.249 00:05:52.249 00:05:52.249 CUnit - A unit testing framework for C - Version 2.1-3 00:05:52.249 http://cunit.sourceforge.net/ 00:05:52.249 00:05:52.249 00:05:52.249 Suite: memory 00:05:52.249 Test: test ... 
00:05:52.249 register 0x200000200000 2097152 00:05:52.249 malloc 3145728 00:05:52.249 register 0x200000400000 4194304 00:05:52.249 buf 0x200000500000 len 3145728 PASSED 00:05:52.249 malloc 64 00:05:52.249 buf 0x2000004fff40 len 64 PASSED 00:05:52.249 malloc 4194304 00:05:52.249 register 0x200000800000 6291456 00:05:52.249 buf 0x200000a00000 len 4194304 PASSED 00:05:52.249 free 0x200000500000 3145728 00:05:52.249 free 0x2000004fff40 64 00:05:52.249 unregister 0x200000400000 4194304 PASSED 00:05:52.249 free 0x200000a00000 4194304 00:05:52.249 unregister 0x200000800000 6291456 PASSED 00:05:52.249 malloc 8388608 00:05:52.249 register 0x200000400000 10485760 00:05:52.249 buf 0x200000600000 len 8388608 PASSED 00:05:52.249 free 0x200000600000 8388608 00:05:52.249 unregister 0x200000400000 10485760 PASSED 00:05:52.249 passed 00:05:52.249 00:05:52.249 Run Summary: Type Total Ran Passed Failed Inactive 00:05:52.249 suites 1 1 n/a 0 0 00:05:52.249 tests 1 1 1 0 0 00:05:52.249 asserts 15 15 15 0 n/a 00:05:52.249 00:05:52.249 Elapsed time = 0.006 seconds 00:05:52.249 00:05:52.249 real 0m0.057s 00:05:52.249 user 0m0.010s 00:05:52.249 sys 0m0.047s 00:05:52.249 20:01:36 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:52.249 20:01:36 -- common/autotest_common.sh@10 -- # set +x 00:05:52.249 ************************************ 00:05:52.249 END TEST env_mem_callbacks 00:05:52.249 ************************************ 00:05:52.249 00:05:52.249 real 0m9.020s 00:05:52.249 user 0m6.155s 00:05:52.249 sys 0m2.028s 00:05:52.249 20:01:36 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:52.249 20:01:36 -- common/autotest_common.sh@10 -- # set +x 00:05:52.249 ************************************ 00:05:52.249 END TEST env 00:05:52.249 ************************************ 00:05:52.508 20:01:36 -- spdk/autotest.sh@165 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:52.508 20:01:36 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:52.508 20:01:36 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:52.508 20:01:36 -- common/autotest_common.sh@10 -- # set +x 00:05:52.508 ************************************ 00:05:52.508 START TEST rpc 00:05:52.508 ************************************ 00:05:52.508 20:01:36 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:52.766 * Looking for test storage... 00:05:52.766 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:52.766 20:01:36 -- rpc/rpc.sh@65 -- # spdk_pid=1602483 00:05:52.766 20:01:36 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:52.766 20:01:36 -- rpc/rpc.sh@67 -- # waitforlisten 1602483 00:05:52.766 20:01:36 -- common/autotest_common.sh@827 -- # '[' -z 1602483 ']' 00:05:52.766 20:01:36 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:52.766 20:01:36 -- common/autotest_common.sh@832 -- # local max_retries=100 00:05:52.766 20:01:36 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:52.766 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:52.766 20:01:36 -- common/autotest_common.sh@836 -- # xtrace_disable 00:05:52.766 20:01:36 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:52.766 20:01:36 -- common/autotest_common.sh@10 -- # set +x 00:05:52.766 [2024-04-26 20:01:37.021722] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:05:52.766 [2024-04-26 20:01:37.021824] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1602483 ] 00:05:52.766 EAL: No free 2048 kB hugepages reported on node 1 00:05:52.766 [2024-04-26 20:01:37.106657] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.766 [2024-04-26 20:01:37.192934] app.c: 523:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:52.766 [2024-04-26 20:01:37.192971] app.c: 527:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1602483' to capture a snapshot of events at runtime. 00:05:52.766 [2024-04-26 20:01:37.192980] app.c: 529:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:52.766 [2024-04-26 20:01:37.192989] app.c: 530:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:52.766 [2024-04-26 20:01:37.192996] app.c: 531:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1602483 for offline analysis/debug. 00:05:52.766 [2024-04-26 20:01:37.193020] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.701 20:01:37 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:05:53.701 20:01:37 -- common/autotest_common.sh@860 -- # return 0 00:05:53.701 20:01:37 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:53.701 20:01:37 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:53.701 20:01:37 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:53.701 20:01:37 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:53.701 20:01:37 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:53.701 20:01:37 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:53.701 20:01:37 -- common/autotest_common.sh@10 -- # set +x 00:05:53.701 ************************************ 00:05:53.701 START TEST rpc_integrity 00:05:53.701 ************************************ 00:05:53.701 20:01:37 -- common/autotest_common.sh@1121 -- # rpc_integrity 00:05:53.701 20:01:37 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:53.701 20:01:37 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:53.701 20:01:37 -- common/autotest_common.sh@10 -- # set +x 00:05:53.701 20:01:37 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:53.701 20:01:37 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:53.701 20:01:37 -- rpc/rpc.sh@13 -- # jq length 00:05:53.701 20:01:38 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:53.701 
20:01:38 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:53.701 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:53.701 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:53.701 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:53.701 20:01:38 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:53.701 20:01:38 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:53.701 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:53.701 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:53.701 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:53.701 20:01:38 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:53.701 { 00:05:53.701 "name": "Malloc0", 00:05:53.701 "aliases": [ 00:05:53.701 "f49972f8-a167-4b0e-b72d-d8f21b23248d" 00:05:53.701 ], 00:05:53.701 "product_name": "Malloc disk", 00:05:53.701 "block_size": 512, 00:05:53.701 "num_blocks": 16384, 00:05:53.701 "uuid": "f49972f8-a167-4b0e-b72d-d8f21b23248d", 00:05:53.701 "assigned_rate_limits": { 00:05:53.701 "rw_ios_per_sec": 0, 00:05:53.701 "rw_mbytes_per_sec": 0, 00:05:53.701 "r_mbytes_per_sec": 0, 00:05:53.701 "w_mbytes_per_sec": 0 00:05:53.701 }, 00:05:53.701 "claimed": false, 00:05:53.701 "zoned": false, 00:05:53.701 "supported_io_types": { 00:05:53.701 "read": true, 00:05:53.701 "write": true, 00:05:53.701 "unmap": true, 00:05:53.701 "write_zeroes": true, 00:05:53.701 "flush": true, 00:05:53.701 "reset": true, 00:05:53.701 "compare": false, 00:05:53.701 "compare_and_write": false, 00:05:53.701 "abort": true, 00:05:53.701 "nvme_admin": false, 00:05:53.701 "nvme_io": false 00:05:53.701 }, 00:05:53.701 "memory_domains": [ 00:05:53.701 { 00:05:53.701 "dma_device_id": "system", 00:05:53.701 "dma_device_type": 1 00:05:53.701 }, 00:05:53.701 { 00:05:53.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:53.701 "dma_device_type": 2 00:05:53.701 } 00:05:53.701 ], 00:05:53.701 "driver_specific": {} 00:05:53.701 } 00:05:53.701 ]' 00:05:53.701 20:01:38 -- rpc/rpc.sh@17 -- # jq length 00:05:53.701 20:01:38 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:53.702 20:01:38 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:53.702 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:53.702 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:53.702 [2024-04-26 20:01:38.109526] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:53.702 [2024-04-26 20:01:38.109559] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:53.702 [2024-04-26 20:01:38.109582] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5067980 00:05:53.702 [2024-04-26 20:01:38.109593] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:53.702 [2024-04-26 20:01:38.110425] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:53.702 [2024-04-26 20:01:38.110450] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:53.702 Passthru0 00:05:53.702 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:53.702 20:01:38 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:53.702 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:53.702 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:53.702 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:53.702 20:01:38 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:53.702 { 00:05:53.702 "name": "Malloc0", 00:05:53.702 "aliases": [ 
00:05:53.702 "f49972f8-a167-4b0e-b72d-d8f21b23248d" 00:05:53.702 ], 00:05:53.702 "product_name": "Malloc disk", 00:05:53.702 "block_size": 512, 00:05:53.702 "num_blocks": 16384, 00:05:53.702 "uuid": "f49972f8-a167-4b0e-b72d-d8f21b23248d", 00:05:53.702 "assigned_rate_limits": { 00:05:53.702 "rw_ios_per_sec": 0, 00:05:53.702 "rw_mbytes_per_sec": 0, 00:05:53.702 "r_mbytes_per_sec": 0, 00:05:53.702 "w_mbytes_per_sec": 0 00:05:53.702 }, 00:05:53.702 "claimed": true, 00:05:53.702 "claim_type": "exclusive_write", 00:05:53.702 "zoned": false, 00:05:53.702 "supported_io_types": { 00:05:53.702 "read": true, 00:05:53.702 "write": true, 00:05:53.702 "unmap": true, 00:05:53.702 "write_zeroes": true, 00:05:53.702 "flush": true, 00:05:53.702 "reset": true, 00:05:53.702 "compare": false, 00:05:53.702 "compare_and_write": false, 00:05:53.702 "abort": true, 00:05:53.702 "nvme_admin": false, 00:05:53.702 "nvme_io": false 00:05:53.702 }, 00:05:53.702 "memory_domains": [ 00:05:53.702 { 00:05:53.702 "dma_device_id": "system", 00:05:53.702 "dma_device_type": 1 00:05:53.702 }, 00:05:53.702 { 00:05:53.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:53.702 "dma_device_type": 2 00:05:53.702 } 00:05:53.702 ], 00:05:53.702 "driver_specific": {} 00:05:53.702 }, 00:05:53.702 { 00:05:53.702 "name": "Passthru0", 00:05:53.702 "aliases": [ 00:05:53.702 "d78460e6-a6ed-58d4-befe-53f2a77acefb" 00:05:53.702 ], 00:05:53.702 "product_name": "passthru", 00:05:53.702 "block_size": 512, 00:05:53.702 "num_blocks": 16384, 00:05:53.702 "uuid": "d78460e6-a6ed-58d4-befe-53f2a77acefb", 00:05:53.702 "assigned_rate_limits": { 00:05:53.702 "rw_ios_per_sec": 0, 00:05:53.702 "rw_mbytes_per_sec": 0, 00:05:53.702 "r_mbytes_per_sec": 0, 00:05:53.702 "w_mbytes_per_sec": 0 00:05:53.702 }, 00:05:53.702 "claimed": false, 00:05:53.702 "zoned": false, 00:05:53.702 "supported_io_types": { 00:05:53.702 "read": true, 00:05:53.702 "write": true, 00:05:53.702 "unmap": true, 00:05:53.702 "write_zeroes": true, 00:05:53.702 "flush": true, 00:05:53.702 "reset": true, 00:05:53.702 "compare": false, 00:05:53.702 "compare_and_write": false, 00:05:53.702 "abort": true, 00:05:53.702 "nvme_admin": false, 00:05:53.702 "nvme_io": false 00:05:53.702 }, 00:05:53.702 "memory_domains": [ 00:05:53.702 { 00:05:53.702 "dma_device_id": "system", 00:05:53.702 "dma_device_type": 1 00:05:53.702 }, 00:05:53.702 { 00:05:53.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:53.702 "dma_device_type": 2 00:05:53.702 } 00:05:53.702 ], 00:05:53.702 "driver_specific": { 00:05:53.702 "passthru": { 00:05:53.702 "name": "Passthru0", 00:05:53.702 "base_bdev_name": "Malloc0" 00:05:53.702 } 00:05:53.702 } 00:05:53.702 } 00:05:53.702 ]' 00:05:53.702 20:01:38 -- rpc/rpc.sh@21 -- # jq length 00:05:53.960 20:01:38 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:53.960 20:01:38 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:53.960 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:53.960 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:53.960 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:53.960 20:01:38 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:53.960 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:53.960 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:53.961 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:53.961 20:01:38 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:53.961 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:53.961 20:01:38 -- 
common/autotest_common.sh@10 -- # set +x 00:05:53.961 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:53.961 20:01:38 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:53.961 20:01:38 -- rpc/rpc.sh@26 -- # jq length 00:05:53.961 20:01:38 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:53.961 00:05:53.961 real 0m0.261s 00:05:53.961 user 0m0.161s 00:05:53.961 sys 0m0.036s 00:05:53.961 20:01:38 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:53.961 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:53.961 ************************************ 00:05:53.961 END TEST rpc_integrity 00:05:53.961 ************************************ 00:05:53.961 20:01:38 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:53.961 20:01:38 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:53.961 20:01:38 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:53.961 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.219 ************************************ 00:05:54.219 START TEST rpc_plugins 00:05:54.219 ************************************ 00:05:54.219 20:01:38 -- common/autotest_common.sh@1121 -- # rpc_plugins 00:05:54.219 20:01:38 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:54.219 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.219 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.219 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.219 20:01:38 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:54.219 20:01:38 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:54.219 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.219 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.219 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.219 20:01:38 -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:54.219 { 00:05:54.219 "name": "Malloc1", 00:05:54.219 "aliases": [ 00:05:54.219 "468001e5-916c-4a33-be9b-b467346737de" 00:05:54.219 ], 00:05:54.219 "product_name": "Malloc disk", 00:05:54.219 "block_size": 4096, 00:05:54.219 "num_blocks": 256, 00:05:54.219 "uuid": "468001e5-916c-4a33-be9b-b467346737de", 00:05:54.219 "assigned_rate_limits": { 00:05:54.219 "rw_ios_per_sec": 0, 00:05:54.219 "rw_mbytes_per_sec": 0, 00:05:54.219 "r_mbytes_per_sec": 0, 00:05:54.219 "w_mbytes_per_sec": 0 00:05:54.219 }, 00:05:54.219 "claimed": false, 00:05:54.219 "zoned": false, 00:05:54.219 "supported_io_types": { 00:05:54.219 "read": true, 00:05:54.219 "write": true, 00:05:54.219 "unmap": true, 00:05:54.219 "write_zeroes": true, 00:05:54.219 "flush": true, 00:05:54.219 "reset": true, 00:05:54.219 "compare": false, 00:05:54.219 "compare_and_write": false, 00:05:54.219 "abort": true, 00:05:54.219 "nvme_admin": false, 00:05:54.219 "nvme_io": false 00:05:54.219 }, 00:05:54.219 "memory_domains": [ 00:05:54.219 { 00:05:54.219 "dma_device_id": "system", 00:05:54.219 "dma_device_type": 1 00:05:54.219 }, 00:05:54.219 { 00:05:54.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:54.219 "dma_device_type": 2 00:05:54.219 } 00:05:54.219 ], 00:05:54.219 "driver_specific": {} 00:05:54.219 } 00:05:54.219 ]' 00:05:54.219 20:01:38 -- rpc/rpc.sh@32 -- # jq length 00:05:54.219 20:01:38 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:54.219 20:01:38 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:54.219 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.219 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.219 20:01:38 -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:05:54.219 20:01:38 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:54.219 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.219 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.219 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.219 20:01:38 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:54.219 20:01:38 -- rpc/rpc.sh@36 -- # jq length 00:05:54.219 20:01:38 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:54.219 00:05:54.219 real 0m0.143s 00:05:54.219 user 0m0.084s 00:05:54.219 sys 0m0.021s 00:05:54.219 20:01:38 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:54.219 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.219 ************************************ 00:05:54.219 END TEST rpc_plugins 00:05:54.219 ************************************ 00:05:54.219 20:01:38 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:54.219 20:01:38 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:54.219 20:01:38 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:54.219 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.477 ************************************ 00:05:54.477 START TEST rpc_trace_cmd_test 00:05:54.477 ************************************ 00:05:54.477 20:01:38 -- common/autotest_common.sh@1121 -- # rpc_trace_cmd_test 00:05:54.477 20:01:38 -- rpc/rpc.sh@40 -- # local info 00:05:54.477 20:01:38 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:54.477 20:01:38 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.477 20:01:38 -- common/autotest_common.sh@10 -- # set +x 00:05:54.477 20:01:38 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.477 20:01:38 -- rpc/rpc.sh@42 -- # info='{ 00:05:54.477 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1602483", 00:05:54.477 "tpoint_group_mask": "0x8", 00:05:54.477 "iscsi_conn": { 00:05:54.477 "mask": "0x2", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "scsi": { 00:05:54.477 "mask": "0x4", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "bdev": { 00:05:54.477 "mask": "0x8", 00:05:54.477 "tpoint_mask": "0xffffffffffffffff" 00:05:54.477 }, 00:05:54.477 "nvmf_rdma": { 00:05:54.477 "mask": "0x10", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "nvmf_tcp": { 00:05:54.477 "mask": "0x20", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "ftl": { 00:05:54.477 "mask": "0x40", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "blobfs": { 00:05:54.477 "mask": "0x80", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "dsa": { 00:05:54.477 "mask": "0x200", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "thread": { 00:05:54.477 "mask": "0x400", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "nvme_pcie": { 00:05:54.477 "mask": "0x800", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "iaa": { 00:05:54.477 "mask": "0x1000", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "nvme_tcp": { 00:05:54.477 "mask": "0x2000", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "bdev_nvme": { 00:05:54.477 "mask": "0x4000", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 }, 00:05:54.477 "sock": { 00:05:54.477 "mask": "0x8000", 00:05:54.477 "tpoint_mask": "0x0" 00:05:54.477 } 00:05:54.477 }' 00:05:54.477 20:01:38 -- rpc/rpc.sh@43 -- # jq length 00:05:54.477 20:01:38 -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:54.477 20:01:38 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 
00:05:54.477 20:01:38 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:54.477 20:01:38 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:54.735 20:01:38 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:54.735 20:01:38 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:54.735 20:01:38 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:54.735 20:01:38 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:54.735 20:01:39 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:54.735 00:05:54.735 real 0m0.230s 00:05:54.735 user 0m0.180s 00:05:54.735 sys 0m0.042s 00:05:54.735 20:01:39 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:54.735 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.735 ************************************ 00:05:54.735 END TEST rpc_trace_cmd_test 00:05:54.735 ************************************ 00:05:54.735 20:01:39 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:54.735 20:01:39 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:54.735 20:01:39 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:54.735 20:01:39 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:54.735 20:01:39 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:54.735 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.993 ************************************ 00:05:54.993 START TEST rpc_daemon_integrity 00:05:54.993 ************************************ 00:05:54.993 20:01:39 -- common/autotest_common.sh@1121 -- # rpc_integrity 00:05:54.993 20:01:39 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:54.993 20:01:39 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.993 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.993 20:01:39 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.993 20:01:39 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:54.993 20:01:39 -- rpc/rpc.sh@13 -- # jq length 00:05:54.993 20:01:39 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:54.993 20:01:39 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:54.993 20:01:39 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.993 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.993 20:01:39 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.993 20:01:39 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:54.993 20:01:39 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:54.994 20:01:39 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.994 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.994 20:01:39 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.994 20:01:39 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:54.994 { 00:05:54.994 "name": "Malloc2", 00:05:54.994 "aliases": [ 00:05:54.994 "3d06ffef-094f-426d-bca5-61acfefbf5e4" 00:05:54.994 ], 00:05:54.994 "product_name": "Malloc disk", 00:05:54.994 "block_size": 512, 00:05:54.994 "num_blocks": 16384, 00:05:54.994 "uuid": "3d06ffef-094f-426d-bca5-61acfefbf5e4", 00:05:54.994 "assigned_rate_limits": { 00:05:54.994 "rw_ios_per_sec": 0, 00:05:54.994 "rw_mbytes_per_sec": 0, 00:05:54.994 "r_mbytes_per_sec": 0, 00:05:54.994 "w_mbytes_per_sec": 0 00:05:54.994 }, 00:05:54.994 "claimed": false, 00:05:54.994 "zoned": false, 00:05:54.994 "supported_io_types": { 00:05:54.994 "read": true, 00:05:54.994 "write": true, 00:05:54.994 "unmap": true, 00:05:54.994 "write_zeroes": true, 00:05:54.994 "flush": true, 00:05:54.994 "reset": true, 00:05:54.994 "compare": false, 00:05:54.994 "compare_and_write": false, 00:05:54.994 "abort": true, 00:05:54.994 "nvme_admin": false, 00:05:54.994 
"nvme_io": false 00:05:54.994 }, 00:05:54.994 "memory_domains": [ 00:05:54.994 { 00:05:54.994 "dma_device_id": "system", 00:05:54.994 "dma_device_type": 1 00:05:54.994 }, 00:05:54.994 { 00:05:54.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:54.994 "dma_device_type": 2 00:05:54.994 } 00:05:54.994 ], 00:05:54.994 "driver_specific": {} 00:05:54.994 } 00:05:54.994 ]' 00:05:54.994 20:01:39 -- rpc/rpc.sh@17 -- # jq length 00:05:54.994 20:01:39 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:54.994 20:01:39 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:54.994 20:01:39 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.994 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.994 [2024-04-26 20:01:39.308619] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:54.994 [2024-04-26 20:01:39.308651] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:54.994 [2024-04-26 20:01:39.308667] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x50226f0 00:05:54.994 [2024-04-26 20:01:39.308677] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:54.994 [2024-04-26 20:01:39.309426] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:54.994 [2024-04-26 20:01:39.309448] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:54.994 Passthru0 00:05:54.994 20:01:39 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.994 20:01:39 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:54.994 20:01:39 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.994 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.994 20:01:39 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.994 20:01:39 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:54.994 { 00:05:54.994 "name": "Malloc2", 00:05:54.994 "aliases": [ 00:05:54.994 "3d06ffef-094f-426d-bca5-61acfefbf5e4" 00:05:54.994 ], 00:05:54.994 "product_name": "Malloc disk", 00:05:54.994 "block_size": 512, 00:05:54.994 "num_blocks": 16384, 00:05:54.994 "uuid": "3d06ffef-094f-426d-bca5-61acfefbf5e4", 00:05:54.994 "assigned_rate_limits": { 00:05:54.994 "rw_ios_per_sec": 0, 00:05:54.994 "rw_mbytes_per_sec": 0, 00:05:54.994 "r_mbytes_per_sec": 0, 00:05:54.994 "w_mbytes_per_sec": 0 00:05:54.994 }, 00:05:54.994 "claimed": true, 00:05:54.994 "claim_type": "exclusive_write", 00:05:54.994 "zoned": false, 00:05:54.994 "supported_io_types": { 00:05:54.994 "read": true, 00:05:54.994 "write": true, 00:05:54.994 "unmap": true, 00:05:54.994 "write_zeroes": true, 00:05:54.994 "flush": true, 00:05:54.994 "reset": true, 00:05:54.994 "compare": false, 00:05:54.994 "compare_and_write": false, 00:05:54.994 "abort": true, 00:05:54.994 "nvme_admin": false, 00:05:54.994 "nvme_io": false 00:05:54.994 }, 00:05:54.994 "memory_domains": [ 00:05:54.994 { 00:05:54.994 "dma_device_id": "system", 00:05:54.994 "dma_device_type": 1 00:05:54.994 }, 00:05:54.994 { 00:05:54.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:54.994 "dma_device_type": 2 00:05:54.994 } 00:05:54.994 ], 00:05:54.994 "driver_specific": {} 00:05:54.994 }, 00:05:54.994 { 00:05:54.994 "name": "Passthru0", 00:05:54.994 "aliases": [ 00:05:54.994 "761d8d08-e709-5eab-b2b1-89296f350a14" 00:05:54.994 ], 00:05:54.994 "product_name": "passthru", 00:05:54.994 "block_size": 512, 00:05:54.994 "num_blocks": 16384, 00:05:54.994 "uuid": "761d8d08-e709-5eab-b2b1-89296f350a14", 00:05:54.994 "assigned_rate_limits": 
{ 00:05:54.994 "rw_ios_per_sec": 0, 00:05:54.994 "rw_mbytes_per_sec": 0, 00:05:54.994 "r_mbytes_per_sec": 0, 00:05:54.994 "w_mbytes_per_sec": 0 00:05:54.994 }, 00:05:54.994 "claimed": false, 00:05:54.994 "zoned": false, 00:05:54.994 "supported_io_types": { 00:05:54.994 "read": true, 00:05:54.994 "write": true, 00:05:54.994 "unmap": true, 00:05:54.994 "write_zeroes": true, 00:05:54.994 "flush": true, 00:05:54.994 "reset": true, 00:05:54.994 "compare": false, 00:05:54.994 "compare_and_write": false, 00:05:54.994 "abort": true, 00:05:54.994 "nvme_admin": false, 00:05:54.994 "nvme_io": false 00:05:54.994 }, 00:05:54.994 "memory_domains": [ 00:05:54.994 { 00:05:54.994 "dma_device_id": "system", 00:05:54.994 "dma_device_type": 1 00:05:54.994 }, 00:05:54.994 { 00:05:54.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:54.994 "dma_device_type": 2 00:05:54.994 } 00:05:54.994 ], 00:05:54.994 "driver_specific": { 00:05:54.994 "passthru": { 00:05:54.994 "name": "Passthru0", 00:05:54.994 "base_bdev_name": "Malloc2" 00:05:54.994 } 00:05:54.994 } 00:05:54.994 } 00:05:54.994 ]' 00:05:54.994 20:01:39 -- rpc/rpc.sh@21 -- # jq length 00:05:54.994 20:01:39 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:54.994 20:01:39 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:54.994 20:01:39 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.994 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.994 20:01:39 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.994 20:01:39 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:54.994 20:01:39 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.994 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.994 20:01:39 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.994 20:01:39 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:54.994 20:01:39 -- common/autotest_common.sh@559 -- # xtrace_disable 00:05:54.994 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:54.994 20:01:39 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:05:54.994 20:01:39 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:54.994 20:01:39 -- rpc/rpc.sh@26 -- # jq length 00:05:55.253 20:01:39 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:55.253 00:05:55.253 real 0m0.262s 00:05:55.253 user 0m0.161s 00:05:55.253 sys 0m0.039s 00:05:55.253 20:01:39 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:55.253 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:55.253 ************************************ 00:05:55.253 END TEST rpc_daemon_integrity 00:05:55.253 ************************************ 00:05:55.253 20:01:39 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:55.253 20:01:39 -- rpc/rpc.sh@84 -- # killprocess 1602483 00:05:55.253 20:01:39 -- common/autotest_common.sh@946 -- # '[' -z 1602483 ']' 00:05:55.253 20:01:39 -- common/autotest_common.sh@950 -- # kill -0 1602483 00:05:55.253 20:01:39 -- common/autotest_common.sh@951 -- # uname 00:05:55.253 20:01:39 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:05:55.253 20:01:39 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1602483 00:05:55.253 20:01:39 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:05:55.253 20:01:39 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:05:55.253 20:01:39 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1602483' 00:05:55.253 killing process with pid 1602483 00:05:55.253 20:01:39 -- common/autotest_common.sh@965 -- # kill 1602483 00:05:55.253 20:01:39 -- 
common/autotest_common.sh@970 -- # wait 1602483 00:05:55.512 00:05:55.512 real 0m2.966s 00:05:55.512 user 0m3.769s 00:05:55.512 sys 0m0.988s 00:05:55.512 20:01:39 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:05:55.512 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:55.512 ************************************ 00:05:55.512 END TEST rpc 00:05:55.512 ************************************ 00:05:55.512 20:01:39 -- spdk/autotest.sh@166 -- # run_test skip_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:55.512 20:01:39 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:55.512 20:01:39 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:55.512 20:01:39 -- common/autotest_common.sh@10 -- # set +x 00:05:55.770 ************************************ 00:05:55.770 START TEST skip_rpc 00:05:55.770 ************************************ 00:05:55.770 20:01:40 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:55.770 * Looking for test storage... 00:05:55.771 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:55.771 20:01:40 -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:05:55.771 20:01:40 -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:05:55.771 20:01:40 -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:55.771 20:01:40 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:05:55.771 20:01:40 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:05:55.771 20:01:40 -- common/autotest_common.sh@10 -- # set +x 00:05:56.028 ************************************ 00:05:56.028 START TEST skip_rpc 00:05:56.028 ************************************ 00:05:56.028 20:01:40 -- common/autotest_common.sh@1121 -- # test_skip_rpc 00:05:56.028 20:01:40 -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:56.028 20:01:40 -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1603138 00:05:56.028 20:01:40 -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:56.028 20:01:40 -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:56.028 [2024-04-26 20:01:40.325230] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:05:56.028 [2024-04-26 20:01:40.325281] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1603138 ] 00:05:56.028 EAL: No free 2048 kB hugepages reported on node 1 00:05:56.028 [2024-04-26 20:01:40.405706] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:56.356 [2024-04-26 20:01:40.491990] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:01.622 20:01:45 -- common/autotest_common.sh@648 -- # local es=0 00:06:01.622 20:01:45 -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:01.622 20:01:45 -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:06:01.622 20:01:45 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:01.622 20:01:45 -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:06:01.622 20:01:45 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:01.622 20:01:45 -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:06:01.622 20:01:45 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:01.622 20:01:45 -- common/autotest_common.sh@10 -- # set +x 00:06:01.622 20:01:45 -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:01.622 20:01:45 -- common/autotest_common.sh@651 -- # es=1 00:06:01.622 20:01:45 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:01.622 20:01:45 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:01.622 20:01:45 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@23 -- # killprocess 1603138 00:06:01.622 20:01:45 -- common/autotest_common.sh@946 -- # '[' -z 1603138 ']' 00:06:01.622 20:01:45 -- common/autotest_common.sh@950 -- # kill -0 1603138 00:06:01.622 20:01:45 -- common/autotest_common.sh@951 -- # uname 00:06:01.622 20:01:45 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:01.622 20:01:45 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1603138 00:06:01.622 20:01:45 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:01.622 20:01:45 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:01.622 20:01:45 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1603138' 00:06:01.622 killing process with pid 1603138 00:06:01.622 20:01:45 -- common/autotest_common.sh@965 -- # kill 1603138 00:06:01.622 20:01:45 -- common/autotest_common.sh@970 -- # wait 1603138 00:06:01.622 00:06:01.622 real 0m5.376s 00:06:01.622 user 0m5.105s 00:06:01.622 sys 0m0.291s 00:06:01.622 20:01:45 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:01.622 20:01:45 -- common/autotest_common.sh@10 -- # set +x 00:06:01.622 ************************************ 00:06:01.622 END TEST skip_rpc 00:06:01.622 ************************************ 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:01.622 20:01:45 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:01.622 20:01:45 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:01.622 20:01:45 -- common/autotest_common.sh@10 -- # set +x 00:06:01.622 ************************************ 00:06:01.622 START TEST skip_rpc_with_json 00:06:01.622 ************************************ 
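Note: the skip_rpc run above starts spdk_tgt with --no-rpc-server and then asserts that an RPC call fails. A condensed sketch of that check, assuming the same tree layout as in the trace and using scripts/rpc.py in place of the rpc_cmd helper (the helper itself lives in autotest_common.sh and is not shown in this log):

  # Sketch only: start the target without an RPC server and confirm clients cannot connect.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  $SPDK/build/bin/spdk_tgt --no-rpc-server -m 0x1 &
  spdk_pid=$!
  sleep 5                                  # the test sleeps instead of polling, since no socket will appear
  if $SPDK/scripts/rpc.py spdk_get_version; then
      echo "ERROR: RPC succeeded although the RPC server was disabled" >&2
      kill $spdk_pid
      exit 1
  fi
  kill $spdk_pid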
00:06:01.622 20:01:45 -- common/autotest_common.sh@1121 -- # test_skip_rpc_with_json 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1603963 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:01.622 20:01:45 -- rpc/skip_rpc.sh@31 -- # waitforlisten 1603963 00:06:01.622 20:01:45 -- common/autotest_common.sh@827 -- # '[' -z 1603963 ']' 00:06:01.622 20:01:45 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:01.622 20:01:45 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:01.622 20:01:45 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:01.622 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:01.622 20:01:45 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:01.622 20:01:45 -- common/autotest_common.sh@10 -- # set +x 00:06:01.622 [2024-04-26 20:01:45.905341] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:01.622 [2024-04-26 20:01:45.905418] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1603963 ] 00:06:01.622 EAL: No free 2048 kB hugepages reported on node 1 00:06:01.622 [2024-04-26 20:01:45.989157] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.880 [2024-04-26 20:01:46.077934] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:02.444 20:01:46 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:02.444 20:01:46 -- common/autotest_common.sh@860 -- # return 0 00:06:02.444 20:01:46 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:02.444 20:01:46 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:02.444 20:01:46 -- common/autotest_common.sh@10 -- # set +x 00:06:02.444 [2024-04-26 20:01:46.725711] nvmf_rpc.c:2513:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:02.444 request: 00:06:02.444 { 00:06:02.444 "trtype": "tcp", 00:06:02.444 "method": "nvmf_get_transports", 00:06:02.444 "req_id": 1 00:06:02.444 } 00:06:02.444 Got JSON-RPC error response 00:06:02.444 response: 00:06:02.444 { 00:06:02.444 "code": -19, 00:06:02.444 "message": "No such device" 00:06:02.444 } 00:06:02.444 20:01:46 -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:06:02.444 20:01:46 -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:02.444 20:01:46 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:02.444 20:01:46 -- common/autotest_common.sh@10 -- # set +x 00:06:02.444 [2024-04-26 20:01:46.737798] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:02.444 20:01:46 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:02.444 20:01:46 -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:02.444 20:01:46 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:02.444 20:01:46 -- common/autotest_common.sh@10 -- # set +x 00:06:02.702 20:01:46 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:02.702 20:01:46 -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:02.702 
{ 00:06:02.702 "subsystems": [ 00:06:02.702 { 00:06:02.702 "subsystem": "scheduler", 00:06:02.702 "config": [ 00:06:02.702 { 00:06:02.702 "method": "framework_set_scheduler", 00:06:02.703 "params": { 00:06:02.703 "name": "static" 00:06:02.703 } 00:06:02.703 } 00:06:02.703 ] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "vmd", 00:06:02.703 "config": [] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "sock", 00:06:02.703 "config": [ 00:06:02.703 { 00:06:02.703 "method": "sock_impl_set_options", 00:06:02.703 "params": { 00:06:02.703 "impl_name": "posix", 00:06:02.703 "recv_buf_size": 2097152, 00:06:02.703 "send_buf_size": 2097152, 00:06:02.703 "enable_recv_pipe": true, 00:06:02.703 "enable_quickack": false, 00:06:02.703 "enable_placement_id": 0, 00:06:02.703 "enable_zerocopy_send_server": true, 00:06:02.703 "enable_zerocopy_send_client": false, 00:06:02.703 "zerocopy_threshold": 0, 00:06:02.703 "tls_version": 0, 00:06:02.703 "enable_ktls": false 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "sock_impl_set_options", 00:06:02.703 "params": { 00:06:02.703 "impl_name": "ssl", 00:06:02.703 "recv_buf_size": 4096, 00:06:02.703 "send_buf_size": 4096, 00:06:02.703 "enable_recv_pipe": true, 00:06:02.703 "enable_quickack": false, 00:06:02.703 "enable_placement_id": 0, 00:06:02.703 "enable_zerocopy_send_server": true, 00:06:02.703 "enable_zerocopy_send_client": false, 00:06:02.703 "zerocopy_threshold": 0, 00:06:02.703 "tls_version": 0, 00:06:02.703 "enable_ktls": false 00:06:02.703 } 00:06:02.703 } 00:06:02.703 ] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "iobuf", 00:06:02.703 "config": [ 00:06:02.703 { 00:06:02.703 "method": "iobuf_set_options", 00:06:02.703 "params": { 00:06:02.703 "small_pool_count": 8192, 00:06:02.703 "large_pool_count": 1024, 00:06:02.703 "small_bufsize": 8192, 00:06:02.703 "large_bufsize": 135168 00:06:02.703 } 00:06:02.703 } 00:06:02.703 ] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "keyring", 00:06:02.703 "config": [] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "vfio_user_target", 00:06:02.703 "config": null 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "accel", 00:06:02.703 "config": [ 00:06:02.703 { 00:06:02.703 "method": "accel_set_options", 00:06:02.703 "params": { 00:06:02.703 "small_cache_size": 128, 00:06:02.703 "large_cache_size": 16, 00:06:02.703 "task_count": 2048, 00:06:02.703 "sequence_count": 2048, 00:06:02.703 "buf_count": 2048 00:06:02.703 } 00:06:02.703 } 00:06:02.703 ] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "bdev", 00:06:02.703 "config": [ 00:06:02.703 { 00:06:02.703 "method": "bdev_set_options", 00:06:02.703 "params": { 00:06:02.703 "bdev_io_pool_size": 65535, 00:06:02.703 "bdev_io_cache_size": 256, 00:06:02.703 "bdev_auto_examine": true, 00:06:02.703 "iobuf_small_cache_size": 128, 00:06:02.703 "iobuf_large_cache_size": 16 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "bdev_raid_set_options", 00:06:02.703 "params": { 00:06:02.703 "process_window_size_kb": 1024 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "bdev_nvme_set_options", 00:06:02.703 "params": { 00:06:02.703 "action_on_timeout": "none", 00:06:02.703 "timeout_us": 0, 00:06:02.703 "timeout_admin_us": 0, 00:06:02.703 "keep_alive_timeout_ms": 10000, 00:06:02.703 "arbitration_burst": 0, 00:06:02.703 "low_priority_weight": 0, 00:06:02.703 "medium_priority_weight": 0, 00:06:02.703 "high_priority_weight": 0, 00:06:02.703 "nvme_adminq_poll_period_us": 
10000, 00:06:02.703 "nvme_ioq_poll_period_us": 0, 00:06:02.703 "io_queue_requests": 0, 00:06:02.703 "delay_cmd_submit": true, 00:06:02.703 "transport_retry_count": 4, 00:06:02.703 "bdev_retry_count": 3, 00:06:02.703 "transport_ack_timeout": 0, 00:06:02.703 "ctrlr_loss_timeout_sec": 0, 00:06:02.703 "reconnect_delay_sec": 0, 00:06:02.703 "fast_io_fail_timeout_sec": 0, 00:06:02.703 "disable_auto_failback": false, 00:06:02.703 "generate_uuids": false, 00:06:02.703 "transport_tos": 0, 00:06:02.703 "nvme_error_stat": false, 00:06:02.703 "rdma_srq_size": 0, 00:06:02.703 "io_path_stat": false, 00:06:02.703 "allow_accel_sequence": false, 00:06:02.703 "rdma_max_cq_size": 0, 00:06:02.703 "rdma_cm_event_timeout_ms": 0, 00:06:02.703 "dhchap_digests": [ 00:06:02.703 "sha256", 00:06:02.703 "sha384", 00:06:02.703 "sha512" 00:06:02.703 ], 00:06:02.703 "dhchap_dhgroups": [ 00:06:02.703 "null", 00:06:02.703 "ffdhe2048", 00:06:02.703 "ffdhe3072", 00:06:02.703 "ffdhe4096", 00:06:02.703 "ffdhe6144", 00:06:02.703 "ffdhe8192" 00:06:02.703 ] 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "bdev_nvme_set_hotplug", 00:06:02.703 "params": { 00:06:02.703 "period_us": 100000, 00:06:02.703 "enable": false 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "bdev_iscsi_set_options", 00:06:02.703 "params": { 00:06:02.703 "timeout_sec": 30 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "bdev_wait_for_examine" 00:06:02.703 } 00:06:02.703 ] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "nvmf", 00:06:02.703 "config": [ 00:06:02.703 { 00:06:02.703 "method": "nvmf_set_config", 00:06:02.703 "params": { 00:06:02.703 "discovery_filter": "match_any", 00:06:02.703 "admin_cmd_passthru": { 00:06:02.703 "identify_ctrlr": false 00:06:02.703 } 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "nvmf_set_max_subsystems", 00:06:02.703 "params": { 00:06:02.703 "max_subsystems": 1024 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "nvmf_set_crdt", 00:06:02.703 "params": { 00:06:02.703 "crdt1": 0, 00:06:02.703 "crdt2": 0, 00:06:02.703 "crdt3": 0 00:06:02.703 } 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "method": "nvmf_create_transport", 00:06:02.703 "params": { 00:06:02.703 "trtype": "TCP", 00:06:02.703 "max_queue_depth": 128, 00:06:02.703 "max_io_qpairs_per_ctrlr": 127, 00:06:02.703 "in_capsule_data_size": 4096, 00:06:02.703 "max_io_size": 131072, 00:06:02.703 "io_unit_size": 131072, 00:06:02.703 "max_aq_depth": 128, 00:06:02.703 "num_shared_buffers": 511, 00:06:02.703 "buf_cache_size": 4294967295, 00:06:02.703 "dif_insert_or_strip": false, 00:06:02.703 "zcopy": false, 00:06:02.703 "c2h_success": true, 00:06:02.703 "sock_priority": 0, 00:06:02.703 "abort_timeout_sec": 1, 00:06:02.703 "ack_timeout": 0, 00:06:02.703 "data_wr_pool_size": 0 00:06:02.703 } 00:06:02.703 } 00:06:02.703 ] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "nbd", 00:06:02.703 "config": [] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "ublk", 00:06:02.703 "config": [] 00:06:02.703 }, 00:06:02.703 { 00:06:02.703 "subsystem": "vhost_blk", 00:06:02.704 "config": [] 00:06:02.704 }, 00:06:02.704 { 00:06:02.704 "subsystem": "scsi", 00:06:02.704 "config": null 00:06:02.704 }, 00:06:02.704 { 00:06:02.704 "subsystem": "iscsi", 00:06:02.704 "config": [ 00:06:02.704 { 00:06:02.704 "method": "iscsi_set_options", 00:06:02.704 "params": { 00:06:02.704 "node_base": "iqn.2016-06.io.spdk", 00:06:02.704 "max_sessions": 128, 00:06:02.704 
"max_connections_per_session": 2, 00:06:02.704 "max_queue_depth": 64, 00:06:02.704 "default_time2wait": 2, 00:06:02.704 "default_time2retain": 20, 00:06:02.704 "first_burst_length": 8192, 00:06:02.704 "immediate_data": true, 00:06:02.704 "allow_duplicated_isid": false, 00:06:02.704 "error_recovery_level": 0, 00:06:02.704 "nop_timeout": 60, 00:06:02.704 "nop_in_interval": 30, 00:06:02.704 "disable_chap": false, 00:06:02.704 "require_chap": false, 00:06:02.704 "mutual_chap": false, 00:06:02.704 "chap_group": 0, 00:06:02.704 "max_large_datain_per_connection": 64, 00:06:02.704 "max_r2t_per_connection": 4, 00:06:02.704 "pdu_pool_size": 36864, 00:06:02.704 "immediate_data_pool_size": 16384, 00:06:02.704 "data_out_pool_size": 2048 00:06:02.704 } 00:06:02.704 } 00:06:02.704 ] 00:06:02.704 }, 00:06:02.704 { 00:06:02.704 "subsystem": "vhost_scsi", 00:06:02.704 "config": [] 00:06:02.704 } 00:06:02.704 ] 00:06:02.704 } 00:06:02.704 20:01:46 -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:02.704 20:01:46 -- rpc/skip_rpc.sh@40 -- # killprocess 1603963 00:06:02.704 20:01:46 -- common/autotest_common.sh@946 -- # '[' -z 1603963 ']' 00:06:02.704 20:01:46 -- common/autotest_common.sh@950 -- # kill -0 1603963 00:06:02.704 20:01:46 -- common/autotest_common.sh@951 -- # uname 00:06:02.704 20:01:46 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:02.704 20:01:46 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1603963 00:06:02.704 20:01:46 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:02.704 20:01:46 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:02.704 20:01:46 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1603963' 00:06:02.704 killing process with pid 1603963 00:06:02.704 20:01:46 -- common/autotest_common.sh@965 -- # kill 1603963 00:06:02.704 20:01:46 -- common/autotest_common.sh@970 -- # wait 1603963 00:06:02.962 20:01:47 -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1604145 00:06:02.962 20:01:47 -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:02.962 20:01:47 -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:08.225 20:01:52 -- rpc/skip_rpc.sh@50 -- # killprocess 1604145 00:06:08.225 20:01:52 -- common/autotest_common.sh@946 -- # '[' -z 1604145 ']' 00:06:08.225 20:01:52 -- common/autotest_common.sh@950 -- # kill -0 1604145 00:06:08.225 20:01:52 -- common/autotest_common.sh@951 -- # uname 00:06:08.225 20:01:52 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:08.225 20:01:52 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1604145 00:06:08.225 20:01:52 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:08.225 20:01:52 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:08.225 20:01:52 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1604145' 00:06:08.225 killing process with pid 1604145 00:06:08.225 20:01:52 -- common/autotest_common.sh@965 -- # kill 1604145 00:06:08.225 20:01:52 -- common/autotest_common.sh@970 -- # wait 1604145 00:06:08.483 20:01:52 -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:08.483 20:01:52 -- rpc/skip_rpc.sh@52 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/log.txt 00:06:08.483 00:06:08.483 real 0m6.819s 00:06:08.483 user 0m6.528s 00:06:08.483 sys 
0m0.707s 00:06:08.483 20:01:52 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:08.483 20:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:08.483 ************************************ 00:06:08.483 END TEST skip_rpc_with_json 00:06:08.483 ************************************ 00:06:08.483 20:01:52 -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:06:08.483 20:01:52 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:08.483 20:01:52 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:08.483 20:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:08.483 ************************************ 00:06:08.483 START TEST skip_rpc_with_delay 00:06:08.483 ************************************ 00:06:08.483 20:01:52 -- common/autotest_common.sh@1121 -- # test_skip_rpc_with_delay 00:06:08.483 20:01:52 -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:08.483 20:01:52 -- common/autotest_common.sh@648 -- # local es=0 00:06:08.483 20:01:52 -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:08.483 20:01:52 -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:08.483 20:01:52 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:08.483 20:01:52 -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:08.483 20:01:52 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:08.483 20:01:52 -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:08.483 20:01:52 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:08.483 20:01:52 -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:08.483 20:01:52 -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:08.483 20:01:52 -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:06:08.742 [2024-04-26 20:01:52.930393] app.c: 751:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
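Note: the error printed above is the expected outcome of skip_rpc_with_delay, which verifies that --wait-for-rpc is rejected when the RPC server is disabled. A minimal sketch of that assertion, using the same flags that appear in the trace:

  # Sketch only: spdk_tgt must exit non-zero when --wait-for-rpc is combined with --no-rpc-server.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  if $SPDK/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
      echo "ERROR: spdk_tgt accepted --wait-for-rpc with the RPC server disabled" >&2
      exit 1
  fi
  echo "flag combination rejected as expected"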
00:06:08.742 [2024-04-26 20:01:52.930539] app.c: 630:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:06:08.742 20:01:52 -- common/autotest_common.sh@651 -- # es=1 00:06:08.742 20:01:52 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:08.742 20:01:52 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:08.742 20:01:52 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:08.742 00:06:08.742 real 0m0.044s 00:06:08.742 user 0m0.020s 00:06:08.742 sys 0m0.024s 00:06:08.742 20:01:52 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:08.742 20:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:08.742 ************************************ 00:06:08.742 END TEST skip_rpc_with_delay 00:06:08.742 ************************************ 00:06:08.742 20:01:52 -- rpc/skip_rpc.sh@77 -- # uname 00:06:08.742 20:01:52 -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:06:08.742 20:01:52 -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:06:08.742 20:01:52 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:08.742 20:01:52 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:08.742 20:01:52 -- common/autotest_common.sh@10 -- # set +x 00:06:08.742 ************************************ 00:06:08.742 START TEST exit_on_failed_rpc_init 00:06:08.742 ************************************ 00:06:08.742 20:01:53 -- common/autotest_common.sh@1121 -- # test_exit_on_failed_rpc_init 00:06:08.742 20:01:53 -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1604922 00:06:08.742 20:01:53 -- rpc/skip_rpc.sh@63 -- # waitforlisten 1604922 00:06:08.742 20:01:53 -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:08.742 20:01:53 -- common/autotest_common.sh@827 -- # '[' -z 1604922 ']' 00:06:08.742 20:01:53 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:08.742 20:01:53 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:08.742 20:01:53 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:08.742 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:08.742 20:01:53 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:08.742 20:01:53 -- common/autotest_common.sh@10 -- # set +x 00:06:08.742 [2024-04-26 20:01:53.169618] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:06:08.742 [2024-04-26 20:01:53.169694] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1604922 ] 00:06:09.001 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.001 [2024-04-26 20:01:53.255411] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.001 [2024-04-26 20:01:53.346061] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.566 20:01:53 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:09.566 20:01:53 -- common/autotest_common.sh@860 -- # return 0 00:06:09.566 20:01:53 -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:09.566 20:01:53 -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:09.566 20:01:53 -- common/autotest_common.sh@648 -- # local es=0 00:06:09.566 20:01:53 -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:09.566 20:01:53 -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:09.566 20:01:53 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:09.566 20:01:54 -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:09.566 20:01:53 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:09.566 20:01:54 -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:09.566 20:01:53 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:09.566 20:01:54 -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:09.566 20:01:54 -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:06:09.566 20:01:54 -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:06:09.825 [2024-04-26 20:01:54.028221] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:09.825 [2024-04-26 20:01:54.028296] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605100 ] 00:06:09.825 EAL: No free 2048 kB hugepages reported on node 1 00:06:09.825 [2024-04-26 20:01:54.108571] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.825 [2024-04-26 20:01:54.187412] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:09.825 [2024-04-26 20:01:54.187505] rpc.c: 181:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:06:09.825 [2024-04-26 20:01:54.187518] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:06:09.825 [2024-04-26 20:01:54.187529] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:09.825 20:01:54 -- common/autotest_common.sh@651 -- # es=234 00:06:09.825 20:01:54 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:09.825 20:01:54 -- common/autotest_common.sh@660 -- # es=106 00:06:09.825 20:01:54 -- common/autotest_common.sh@661 -- # case "$es" in 00:06:09.825 20:01:54 -- common/autotest_common.sh@668 -- # es=1 00:06:09.825 20:01:54 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:09.825 20:01:54 -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:06:10.083 20:01:54 -- rpc/skip_rpc.sh@70 -- # killprocess 1604922 00:06:10.083 20:01:54 -- common/autotest_common.sh@946 -- # '[' -z 1604922 ']' 00:06:10.083 20:01:54 -- common/autotest_common.sh@950 -- # kill -0 1604922 00:06:10.083 20:01:54 -- common/autotest_common.sh@951 -- # uname 00:06:10.083 20:01:54 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:10.083 20:01:54 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1604922 00:06:10.083 20:01:54 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:10.083 20:01:54 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:10.083 20:01:54 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1604922' 00:06:10.083 killing process with pid 1604922 00:06:10.083 20:01:54 -- common/autotest_common.sh@965 -- # kill 1604922 00:06:10.083 20:01:54 -- common/autotest_common.sh@970 -- # wait 1604922 00:06:10.343 00:06:10.343 real 0m1.512s 00:06:10.343 user 0m1.661s 00:06:10.343 sys 0m0.491s 00:06:10.343 20:01:54 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:10.343 20:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:10.343 ************************************ 00:06:10.343 END TEST exit_on_failed_rpc_init 00:06:10.343 ************************************ 00:06:10.343 20:01:54 -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/config.json 00:06:10.343 00:06:10.343 real 0m14.652s 00:06:10.343 user 0m13.658s 00:06:10.343 sys 0m2.041s 00:06:10.343 20:01:54 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:10.343 20:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:10.343 ************************************ 00:06:10.343 END TEST skip_rpc 00:06:10.343 ************************************ 00:06:10.343 20:01:54 -- spdk/autotest.sh@167 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:10.343 20:01:54 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:10.343 20:01:54 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:10.343 20:01:54 -- common/autotest_common.sh@10 -- # set +x 00:06:10.603 ************************************ 00:06:10.603 START TEST rpc_client 00:06:10.603 ************************************ 00:06:10.603 20:01:54 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:06:10.603 * Looking for test storage... 
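Note: the "RPC Unix domain socket path /var/tmp/spdk.sock in use" error above is exactly what exit_on_failed_rpc_init looks for: a second target on the default socket must fail to initialize rather than take over the socket. A rough sketch of the scenario, with scripts/rpc.py standing in for the test's waitforlisten helper:

  # Sketch only: two targets, one default RPC socket; the second one is expected to exit non-zero.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  $SPDK/build/bin/spdk_tgt -m 0x1 &                 # first instance owns /var/tmp/spdk.sock
  first=$!
  until $SPDK/scripts/rpc.py spdk_get_version >/dev/null 2>&1; do sleep 0.2; done
  if $SPDK/build/bin/spdk_tgt -m 0x2; then          # same default socket, different core mask
      echo "ERROR: second target started despite the socket being in use" >&2
      kill $first
      exit 1
  fi
  kill $first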
00:06:10.603 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:06:10.603 20:01:55 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:06:10.603 OK 00:06:10.603 20:01:55 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:06:10.603 00:06:10.603 real 0m0.124s 00:06:10.603 user 0m0.045s 00:06:10.603 sys 0m0.088s 00:06:10.603 20:01:55 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:10.603 20:01:55 -- common/autotest_common.sh@10 -- # set +x 00:06:10.603 ************************************ 00:06:10.603 END TEST rpc_client 00:06:10.603 ************************************ 00:06:10.862 20:01:55 -- spdk/autotest.sh@168 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:10.862 20:01:55 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:10.862 20:01:55 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:10.862 20:01:55 -- common/autotest_common.sh@10 -- # set +x 00:06:10.862 ************************************ 00:06:10.862 START TEST json_config 00:06:10.862 ************************************ 00:06:10.862 20:01:55 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:06:11.121 20:01:55 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:11.121 20:01:55 -- nvmf/common.sh@7 -- # uname -s 00:06:11.121 20:01:55 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:11.121 20:01:55 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:11.121 20:01:55 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:11.121 20:01:55 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:11.122 20:01:55 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:11.122 20:01:55 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:11.122 20:01:55 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:11.122 20:01:55 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:11.122 20:01:55 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:11.122 20:01:55 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:11.122 20:01:55 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:06:11.122 20:01:55 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:06:11.122 20:01:55 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:11.122 20:01:55 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:11.122 20:01:55 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:11.122 20:01:55 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:11.122 20:01:55 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:11.122 20:01:55 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:11.122 20:01:55 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:11.122 20:01:55 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:11.122 20:01:55 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:11.122 20:01:55 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:11.122 20:01:55 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:11.122 20:01:55 -- paths/export.sh@5 -- # export PATH 00:06:11.122 20:01:55 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:11.122 20:01:55 -- nvmf/common.sh@47 -- # : 0 00:06:11.122 20:01:55 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:11.122 20:01:55 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:11.122 20:01:55 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:11.122 20:01:55 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:11.122 20:01:55 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:11.122 20:01:55 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:11.122 20:01:55 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:11.122 20:01:55 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:11.122 20:01:55 -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:11.122 20:01:55 -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:06:11.122 20:01:55 -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:06:11.122 20:01:55 -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:06:11.122 20:01:55 -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:06:11.122 20:01:55 -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:06:11.122 WARNING: No tests are enabled so not running JSON configuration tests 00:06:11.122 20:01:55 -- json_config/json_config.sh@28 -- # exit 0 00:06:11.122 00:06:11.122 real 0m0.114s 00:06:11.122 user 0m0.049s 00:06:11.122 sys 0m0.067s 00:06:11.122 20:01:55 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:11.122 20:01:55 -- common/autotest_common.sh@10 -- # set +x 00:06:11.122 ************************************ 00:06:11.122 END TEST 
json_config 00:06:11.122 ************************************ 00:06:11.122 20:01:55 -- spdk/autotest.sh@169 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:11.122 20:01:55 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:11.122 20:01:55 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:11.122 20:01:55 -- common/autotest_common.sh@10 -- # set +x 00:06:11.122 ************************************ 00:06:11.122 START TEST json_config_extra_key 00:06:11.122 ************************************ 00:06:11.122 20:01:55 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:11.381 20:01:55 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:06:11.381 20:01:55 -- nvmf/common.sh@7 -- # uname -s 00:06:11.381 20:01:55 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:11.381 20:01:55 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:11.381 20:01:55 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:11.381 20:01:55 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:11.381 20:01:55 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:11.381 20:01:55 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:11.381 20:01:55 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:11.381 20:01:55 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:11.381 20:01:55 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:11.381 20:01:55 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:11.381 20:01:55 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:8023d868-666a-e711-906e-0017a4403562 00:06:11.381 20:01:55 -- nvmf/common.sh@18 -- # NVME_HOSTID=8023d868-666a-e711-906e-0017a4403562 00:06:11.381 20:01:55 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:11.381 20:01:55 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:11.381 20:01:55 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:11.381 20:01:55 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:11.381 20:01:55 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:06:11.381 20:01:55 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:06:11.381 20:01:55 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:11.381 20:01:55 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:11.381 20:01:55 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:11.381 20:01:55 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:11.381 20:01:55 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:11.381 20:01:55 -- paths/export.sh@5 -- # export PATH 00:06:11.381 20:01:55 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:11.381 20:01:55 -- nvmf/common.sh@47 -- # : 0 00:06:11.382 20:01:55 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:11.382 20:01:55 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:11.382 20:01:55 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:11.382 20:01:55 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:11.382 20:01:55 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:11.382 20:01:55 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:11.382 20:01:55 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:11.382 20:01:55 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/common.sh 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:11.382 INFO: launching applications... 
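Note: json_config_extra_key launches the target on its own RPC socket (/var/tmp/spdk_tgt.sock) with a JSON config, as declared in the app_socket/app_params/configs_path arrays above. A simplified sketch of that launch plus the listen wait, assuming scripts/rpc.py as the RPC client (the test's waitforlisten helper is not shown in this log):

  # Sketch only: start a target with a dedicated RPC socket and a JSON config, then wait for it.
  SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  $SPDK/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
      --json $SPDK/test/json_config/extra_key.json &
  app_pid=$!
  # poll the dedicated socket until the target answers RPC requests
  until $SPDK/scripts/rpc.py -s /var/tmp/spdk_tgt.sock spdk_get_version >/dev/null 2>&1; do
      sleep 0.2
  done
  echo "target is up with pid $app_pid"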
00:06:11.382 20:01:55 -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:11.382 20:01:55 -- json_config/common.sh@9 -- # local app=target 00:06:11.382 20:01:55 -- json_config/common.sh@10 -- # shift 00:06:11.382 20:01:55 -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:11.382 20:01:55 -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:11.382 20:01:55 -- json_config/common.sh@15 -- # local app_extra_params= 00:06:11.382 20:01:55 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:11.382 20:01:55 -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:11.382 20:01:55 -- json_config/common.sh@22 -- # app_pid["$app"]=1605446 00:06:11.382 20:01:55 -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:11.382 Waiting for target to run... 00:06:11.382 20:01:55 -- json_config/common.sh@25 -- # waitforlisten 1605446 /var/tmp/spdk_tgt.sock 00:06:11.382 20:01:55 -- common/autotest_common.sh@827 -- # '[' -z 1605446 ']' 00:06:11.382 20:01:55 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:11.382 20:01:55 -- json_config/common.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:06:11.382 20:01:55 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:11.382 20:01:55 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:11.382 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:11.382 20:01:55 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:11.382 20:01:55 -- common/autotest_common.sh@10 -- # set +x 00:06:11.382 [2024-04-26 20:01:55.654468] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:11.382 [2024-04-26 20:01:55.654568] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605446 ] 00:06:11.382 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.641 [2024-04-26 20:01:55.962898] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.641 [2024-04-26 20:01:56.032927] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.208 20:01:56 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:12.208 20:01:56 -- common/autotest_common.sh@860 -- # return 0 00:06:12.208 20:01:56 -- json_config/common.sh@26 -- # echo '' 00:06:12.208 00:06:12.208 20:01:56 -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:12.208 INFO: shutting down applications... 
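Note: the shutdown that follows sends SIGINT and then polls the pid for up to roughly 15 seconds (30 iterations of 0.5 s) before reporting "SPDK target shutdown done". A condensed sketch of that loop (the trace drives it through json_config_test_shutdown_app and an app_pid array; the standalone form below takes the pid as an argument):

  # Sketch only: graceful shutdown wait, mirroring the kill -SIGINT / kill -0 / sleep 0.5 loop in the trace.
  app_pid="$1"                        # pid of the previously launched spdk_tgt
  kill -SIGINT "$app_pid"
  for (( i = 0; i < 30; i++ )); do
      if ! kill -0 "$app_pid" 2>/dev/null; then
          echo 'SPDK target shutdown done'
          exit 0
      fi
      sleep 0.5
  done
  echo "ERROR: target did not shut down within 15 seconds" >&2
  exit 1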
00:06:12.208 20:01:56 -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:12.208 20:01:56 -- json_config/common.sh@31 -- # local app=target 00:06:12.208 20:01:56 -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:12.208 20:01:56 -- json_config/common.sh@35 -- # [[ -n 1605446 ]] 00:06:12.208 20:01:56 -- json_config/common.sh@38 -- # kill -SIGINT 1605446 00:06:12.208 20:01:56 -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:12.208 20:01:56 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:12.208 20:01:56 -- json_config/common.sh@41 -- # kill -0 1605446 00:06:12.208 20:01:56 -- json_config/common.sh@45 -- # sleep 0.5 00:06:12.776 20:01:56 -- json_config/common.sh@40 -- # (( i++ )) 00:06:12.776 20:01:56 -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:12.776 20:01:56 -- json_config/common.sh@41 -- # kill -0 1605446 00:06:12.776 20:01:56 -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:12.776 20:01:56 -- json_config/common.sh@43 -- # break 00:06:12.776 20:01:56 -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:12.776 20:01:56 -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:12.776 SPDK target shutdown done 00:06:12.776 20:01:56 -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:12.776 Success 00:06:12.776 00:06:12.776 real 0m1.454s 00:06:12.776 user 0m1.215s 00:06:12.776 sys 0m0.414s 00:06:12.776 20:01:56 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:12.776 20:01:56 -- common/autotest_common.sh@10 -- # set +x 00:06:12.776 ************************************ 00:06:12.776 END TEST json_config_extra_key 00:06:12.776 ************************************ 00:06:12.776 20:01:57 -- spdk/autotest.sh@170 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:12.776 20:01:57 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:12.776 20:01:57 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:12.776 20:01:57 -- common/autotest_common.sh@10 -- # set +x 00:06:12.776 ************************************ 00:06:12.776 START TEST alias_rpc 00:06:12.776 ************************************ 00:06:12.776 20:01:57 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:13.034 * Looking for test storage... 00:06:13.034 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:06:13.034 20:01:57 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:13.034 20:01:57 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1605678 00:06:13.034 20:01:57 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1605678 00:06:13.034 20:01:57 -- common/autotest_common.sh@827 -- # '[' -z 1605678 ']' 00:06:13.034 20:01:57 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:13.034 20:01:57 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:13.034 20:01:57 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:13.034 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:13.034 20:01:57 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:13.034 20:01:57 -- common/autotest_common.sh@10 -- # set +x 00:06:13.034 20:01:57 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:13.034 [2024-04-26 20:01:57.319041] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:13.034 [2024-04-26 20:01:57.319114] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605678 ] 00:06:13.034 EAL: No free 2048 kB hugepages reported on node 1 00:06:13.034 [2024-04-26 20:01:57.402802] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:13.291 [2024-04-26 20:01:57.490905] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.856 20:01:58 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:13.856 20:01:58 -- common/autotest_common.sh@860 -- # return 0 00:06:13.856 20:01:58 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:14.113 20:01:58 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1605678 00:06:14.113 20:01:58 -- common/autotest_common.sh@946 -- # '[' -z 1605678 ']' 00:06:14.113 20:01:58 -- common/autotest_common.sh@950 -- # kill -0 1605678 00:06:14.113 20:01:58 -- common/autotest_common.sh@951 -- # uname 00:06:14.113 20:01:58 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:14.113 20:01:58 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1605678 00:06:14.113 20:01:58 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:14.113 20:01:58 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:14.113 20:01:58 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1605678' 00:06:14.113 killing process with pid 1605678 00:06:14.113 20:01:58 -- common/autotest_common.sh@965 -- # kill 1605678 00:06:14.113 20:01:58 -- common/autotest_common.sh@970 -- # wait 1605678 00:06:14.370 00:06:14.370 real 0m1.538s 00:06:14.370 user 0m1.589s 00:06:14.370 sys 0m0.492s 00:06:14.370 20:01:58 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:14.370 20:01:58 -- common/autotest_common.sh@10 -- # set +x 00:06:14.370 ************************************ 00:06:14.370 END TEST alias_rpc 00:06:14.370 ************************************ 00:06:14.370 20:01:58 -- spdk/autotest.sh@172 -- # [[ 0 -eq 0 ]] 00:06:14.370 20:01:58 -- spdk/autotest.sh@173 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:14.370 20:01:58 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:14.370 20:01:58 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:14.370 20:01:58 -- common/autotest_common.sh@10 -- # set +x 00:06:14.627 ************************************ 00:06:14.627 START TEST spdkcli_tcp 00:06:14.627 ************************************ 00:06:14.627 20:01:58 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:14.627 * Looking for test storage... 
00:06:14.627 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:06:14.627 20:01:59 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:06:14.627 20:01:59 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:14.627 20:01:59 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:06:14.627 20:01:59 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:14.627 20:01:59 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:14.627 20:01:59 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:14.627 20:01:59 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:14.627 20:01:59 -- common/autotest_common.sh@720 -- # xtrace_disable 00:06:14.627 20:01:59 -- common/autotest_common.sh@10 -- # set +x 00:06:14.627 20:01:59 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1605957 00:06:14.627 20:01:59 -- spdkcli/tcp.sh@27 -- # waitforlisten 1605957 00:06:14.627 20:01:59 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:14.627 20:01:59 -- common/autotest_common.sh@827 -- # '[' -z 1605957 ']' 00:06:14.627 20:01:59 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.627 20:01:59 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:14.627 20:01:59 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:14.627 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.627 20:01:59 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:14.627 20:01:59 -- common/autotest_common.sh@10 -- # set +x 00:06:14.627 [2024-04-26 20:01:59.064139] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:06:14.627 [2024-04-26 20:01:59.064211] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1605957 ] 00:06:14.884 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.884 [2024-04-26 20:01:59.148660] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:14.884 [2024-04-26 20:01:59.238639] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:14.884 [2024-04-26 20:01:59.238642] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.816 20:01:59 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:15.816 20:01:59 -- common/autotest_common.sh@860 -- # return 0 00:06:15.816 20:01:59 -- spdkcli/tcp.sh@31 -- # socat_pid=1606102 00:06:15.817 20:01:59 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:15.817 20:01:59 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:15.817 [ 00:06:15.817 "spdk_get_version", 00:06:15.817 "rpc_get_methods", 00:06:15.817 "trace_get_info", 00:06:15.817 "trace_get_tpoint_group_mask", 00:06:15.817 "trace_disable_tpoint_group", 00:06:15.817 "trace_enable_tpoint_group", 00:06:15.817 "trace_clear_tpoint_mask", 00:06:15.817 "trace_set_tpoint_mask", 00:06:15.817 "vfu_tgt_set_base_path", 00:06:15.817 "framework_get_pci_devices", 00:06:15.817 "framework_get_config", 00:06:15.817 "framework_get_subsystems", 00:06:15.817 "keyring_get_keys", 00:06:15.817 "iobuf_get_stats", 00:06:15.817 "iobuf_set_options", 00:06:15.817 "sock_get_default_impl", 00:06:15.817 "sock_set_default_impl", 00:06:15.817 "sock_impl_set_options", 00:06:15.817 "sock_impl_get_options", 00:06:15.817 "vmd_rescan", 00:06:15.817 "vmd_remove_device", 00:06:15.817 "vmd_enable", 00:06:15.817 "accel_get_stats", 00:06:15.817 "accel_set_options", 00:06:15.817 "accel_set_driver", 00:06:15.817 "accel_crypto_key_destroy", 00:06:15.817 "accel_crypto_keys_get", 00:06:15.817 "accel_crypto_key_create", 00:06:15.817 "accel_assign_opc", 00:06:15.817 "accel_get_module_info", 00:06:15.817 "accel_get_opc_assignments", 00:06:15.817 "notify_get_notifications", 00:06:15.817 "notify_get_types", 00:06:15.817 "bdev_get_histogram", 00:06:15.817 "bdev_enable_histogram", 00:06:15.817 "bdev_set_qos_limit", 00:06:15.817 "bdev_set_qd_sampling_period", 00:06:15.817 "bdev_get_bdevs", 00:06:15.817 "bdev_reset_iostat", 00:06:15.817 "bdev_get_iostat", 00:06:15.817 "bdev_examine", 00:06:15.817 "bdev_wait_for_examine", 00:06:15.817 "bdev_set_options", 00:06:15.817 "scsi_get_devices", 00:06:15.817 "thread_set_cpumask", 00:06:15.817 "framework_get_scheduler", 00:06:15.817 "framework_set_scheduler", 00:06:15.817 "framework_get_reactors", 00:06:15.817 "thread_get_io_channels", 00:06:15.817 "thread_get_pollers", 00:06:15.817 "thread_get_stats", 00:06:15.817 "framework_monitor_context_switch", 00:06:15.817 "spdk_kill_instance", 00:06:15.817 "log_enable_timestamps", 00:06:15.817 "log_get_flags", 00:06:15.817 "log_clear_flag", 00:06:15.817 "log_set_flag", 00:06:15.817 "log_get_level", 00:06:15.817 "log_set_level", 00:06:15.817 "log_get_print_level", 00:06:15.817 "log_set_print_level", 00:06:15.817 "framework_enable_cpumask_locks", 00:06:15.817 "framework_disable_cpumask_locks", 00:06:15.817 "framework_wait_init", 00:06:15.817 "framework_start_init", 00:06:15.817 "virtio_blk_create_transport", 
00:06:15.817 "virtio_blk_get_transports", 00:06:15.817 "vhost_controller_set_coalescing", 00:06:15.817 "vhost_get_controllers", 00:06:15.817 "vhost_delete_controller", 00:06:15.817 "vhost_create_blk_controller", 00:06:15.817 "vhost_scsi_controller_remove_target", 00:06:15.817 "vhost_scsi_controller_add_target", 00:06:15.817 "vhost_start_scsi_controller", 00:06:15.817 "vhost_create_scsi_controller", 00:06:15.817 "ublk_recover_disk", 00:06:15.817 "ublk_get_disks", 00:06:15.817 "ublk_stop_disk", 00:06:15.817 "ublk_start_disk", 00:06:15.817 "ublk_destroy_target", 00:06:15.817 "ublk_create_target", 00:06:15.817 "nbd_get_disks", 00:06:15.817 "nbd_stop_disk", 00:06:15.817 "nbd_start_disk", 00:06:15.817 "env_dpdk_get_mem_stats", 00:06:15.817 "nvmf_subsystem_get_listeners", 00:06:15.817 "nvmf_subsystem_get_qpairs", 00:06:15.817 "nvmf_subsystem_get_controllers", 00:06:15.817 "nvmf_get_stats", 00:06:15.817 "nvmf_get_transports", 00:06:15.817 "nvmf_create_transport", 00:06:15.817 "nvmf_get_targets", 00:06:15.817 "nvmf_delete_target", 00:06:15.817 "nvmf_create_target", 00:06:15.817 "nvmf_subsystem_allow_any_host", 00:06:15.817 "nvmf_subsystem_remove_host", 00:06:15.817 "nvmf_subsystem_add_host", 00:06:15.817 "nvmf_ns_remove_host", 00:06:15.817 "nvmf_ns_add_host", 00:06:15.817 "nvmf_subsystem_remove_ns", 00:06:15.817 "nvmf_subsystem_add_ns", 00:06:15.817 "nvmf_subsystem_listener_set_ana_state", 00:06:15.817 "nvmf_discovery_get_referrals", 00:06:15.817 "nvmf_discovery_remove_referral", 00:06:15.817 "nvmf_discovery_add_referral", 00:06:15.817 "nvmf_subsystem_remove_listener", 00:06:15.817 "nvmf_subsystem_add_listener", 00:06:15.817 "nvmf_delete_subsystem", 00:06:15.817 "nvmf_create_subsystem", 00:06:15.817 "nvmf_get_subsystems", 00:06:15.817 "nvmf_set_crdt", 00:06:15.817 "nvmf_set_config", 00:06:15.817 "nvmf_set_max_subsystems", 00:06:15.817 "iscsi_get_histogram", 00:06:15.817 "iscsi_enable_histogram", 00:06:15.817 "iscsi_set_options", 00:06:15.817 "iscsi_get_auth_groups", 00:06:15.817 "iscsi_auth_group_remove_secret", 00:06:15.817 "iscsi_auth_group_add_secret", 00:06:15.817 "iscsi_delete_auth_group", 00:06:15.817 "iscsi_create_auth_group", 00:06:15.817 "iscsi_set_discovery_auth", 00:06:15.817 "iscsi_get_options", 00:06:15.817 "iscsi_target_node_request_logout", 00:06:15.817 "iscsi_target_node_set_redirect", 00:06:15.817 "iscsi_target_node_set_auth", 00:06:15.817 "iscsi_target_node_add_lun", 00:06:15.817 "iscsi_get_stats", 00:06:15.817 "iscsi_get_connections", 00:06:15.817 "iscsi_portal_group_set_auth", 00:06:15.817 "iscsi_start_portal_group", 00:06:15.817 "iscsi_delete_portal_group", 00:06:15.817 "iscsi_create_portal_group", 00:06:15.817 "iscsi_get_portal_groups", 00:06:15.817 "iscsi_delete_target_node", 00:06:15.817 "iscsi_target_node_remove_pg_ig_maps", 00:06:15.817 "iscsi_target_node_add_pg_ig_maps", 00:06:15.817 "iscsi_create_target_node", 00:06:15.817 "iscsi_get_target_nodes", 00:06:15.817 "iscsi_delete_initiator_group", 00:06:15.817 "iscsi_initiator_group_remove_initiators", 00:06:15.817 "iscsi_initiator_group_add_initiators", 00:06:15.817 "iscsi_create_initiator_group", 00:06:15.817 "iscsi_get_initiator_groups", 00:06:15.817 "keyring_file_remove_key", 00:06:15.817 "keyring_file_add_key", 00:06:15.817 "vfu_virtio_create_scsi_endpoint", 00:06:15.817 "vfu_virtio_scsi_remove_target", 00:06:15.817 "vfu_virtio_scsi_add_target", 00:06:15.817 "vfu_virtio_create_blk_endpoint", 00:06:15.817 "vfu_virtio_delete_endpoint", 00:06:15.817 "iaa_scan_accel_module", 00:06:15.817 "dsa_scan_accel_module", 00:06:15.817 
"ioat_scan_accel_module", 00:06:15.817 "accel_error_inject_error", 00:06:15.817 "bdev_iscsi_delete", 00:06:15.817 "bdev_iscsi_create", 00:06:15.817 "bdev_iscsi_set_options", 00:06:15.817 "bdev_virtio_attach_controller", 00:06:15.817 "bdev_virtio_scsi_get_devices", 00:06:15.817 "bdev_virtio_detach_controller", 00:06:15.817 "bdev_virtio_blk_set_hotplug", 00:06:15.817 "bdev_ftl_set_property", 00:06:15.817 "bdev_ftl_get_properties", 00:06:15.817 "bdev_ftl_get_stats", 00:06:15.817 "bdev_ftl_unmap", 00:06:15.817 "bdev_ftl_unload", 00:06:15.817 "bdev_ftl_delete", 00:06:15.817 "bdev_ftl_load", 00:06:15.817 "bdev_ftl_create", 00:06:15.817 "bdev_aio_delete", 00:06:15.817 "bdev_aio_rescan", 00:06:15.817 "bdev_aio_create", 00:06:15.817 "blobfs_create", 00:06:15.817 "blobfs_detect", 00:06:15.817 "blobfs_set_cache_size", 00:06:15.817 "bdev_zone_block_delete", 00:06:15.817 "bdev_zone_block_create", 00:06:15.817 "bdev_delay_delete", 00:06:15.817 "bdev_delay_create", 00:06:15.817 "bdev_delay_update_latency", 00:06:15.817 "bdev_split_delete", 00:06:15.817 "bdev_split_create", 00:06:15.817 "bdev_error_inject_error", 00:06:15.817 "bdev_error_delete", 00:06:15.817 "bdev_error_create", 00:06:15.817 "bdev_raid_set_options", 00:06:15.817 "bdev_raid_remove_base_bdev", 00:06:15.817 "bdev_raid_add_base_bdev", 00:06:15.817 "bdev_raid_delete", 00:06:15.817 "bdev_raid_create", 00:06:15.817 "bdev_raid_get_bdevs", 00:06:15.817 "bdev_lvol_grow_lvstore", 00:06:15.817 "bdev_lvol_get_lvols", 00:06:15.817 "bdev_lvol_get_lvstores", 00:06:15.817 "bdev_lvol_delete", 00:06:15.817 "bdev_lvol_set_read_only", 00:06:15.817 "bdev_lvol_resize", 00:06:15.817 "bdev_lvol_decouple_parent", 00:06:15.817 "bdev_lvol_inflate", 00:06:15.817 "bdev_lvol_rename", 00:06:15.817 "bdev_lvol_clone_bdev", 00:06:15.817 "bdev_lvol_clone", 00:06:15.817 "bdev_lvol_snapshot", 00:06:15.817 "bdev_lvol_create", 00:06:15.817 "bdev_lvol_delete_lvstore", 00:06:15.817 "bdev_lvol_rename_lvstore", 00:06:15.817 "bdev_lvol_create_lvstore", 00:06:15.817 "bdev_passthru_delete", 00:06:15.817 "bdev_passthru_create", 00:06:15.817 "bdev_nvme_cuse_unregister", 00:06:15.817 "bdev_nvme_cuse_register", 00:06:15.817 "bdev_opal_new_user", 00:06:15.817 "bdev_opal_set_lock_state", 00:06:15.817 "bdev_opal_delete", 00:06:15.817 "bdev_opal_get_info", 00:06:15.817 "bdev_opal_create", 00:06:15.817 "bdev_nvme_opal_revert", 00:06:15.817 "bdev_nvme_opal_init", 00:06:15.817 "bdev_nvme_send_cmd", 00:06:15.817 "bdev_nvme_get_path_iostat", 00:06:15.817 "bdev_nvme_get_mdns_discovery_info", 00:06:15.817 "bdev_nvme_stop_mdns_discovery", 00:06:15.817 "bdev_nvme_start_mdns_discovery", 00:06:15.817 "bdev_nvme_set_multipath_policy", 00:06:15.817 "bdev_nvme_set_preferred_path", 00:06:15.817 "bdev_nvme_get_io_paths", 00:06:15.817 "bdev_nvme_remove_error_injection", 00:06:15.817 "bdev_nvme_add_error_injection", 00:06:15.817 "bdev_nvme_get_discovery_info", 00:06:15.817 "bdev_nvme_stop_discovery", 00:06:15.817 "bdev_nvme_start_discovery", 00:06:15.817 "bdev_nvme_get_controller_health_info", 00:06:15.817 "bdev_nvme_disable_controller", 00:06:15.817 "bdev_nvme_enable_controller", 00:06:15.817 "bdev_nvme_reset_controller", 00:06:15.817 "bdev_nvme_get_transport_statistics", 00:06:15.817 "bdev_nvme_apply_firmware", 00:06:15.817 "bdev_nvme_detach_controller", 00:06:15.817 "bdev_nvme_get_controllers", 00:06:15.817 "bdev_nvme_attach_controller", 00:06:15.817 "bdev_nvme_set_hotplug", 00:06:15.817 "bdev_nvme_set_options", 00:06:15.817 "bdev_null_resize", 00:06:15.817 "bdev_null_delete", 00:06:15.817 
"bdev_null_create", 00:06:15.817 "bdev_malloc_delete", 00:06:15.817 "bdev_malloc_create" 00:06:15.817 ] 00:06:15.817 20:02:00 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:15.817 20:02:00 -- common/autotest_common.sh@726 -- # xtrace_disable 00:06:15.817 20:02:00 -- common/autotest_common.sh@10 -- # set +x 00:06:15.817 20:02:00 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:15.817 20:02:00 -- spdkcli/tcp.sh@38 -- # killprocess 1605957 00:06:15.817 20:02:00 -- common/autotest_common.sh@946 -- # '[' -z 1605957 ']' 00:06:15.817 20:02:00 -- common/autotest_common.sh@950 -- # kill -0 1605957 00:06:15.817 20:02:00 -- common/autotest_common.sh@951 -- # uname 00:06:15.817 20:02:00 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:15.817 20:02:00 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1605957 00:06:15.817 20:02:00 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:15.817 20:02:00 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:15.817 20:02:00 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1605957' 00:06:15.817 killing process with pid 1605957 00:06:15.817 20:02:00 -- common/autotest_common.sh@965 -- # kill 1605957 00:06:15.817 20:02:00 -- common/autotest_common.sh@970 -- # wait 1605957 00:06:16.075 00:06:16.075 real 0m1.592s 00:06:16.075 user 0m2.851s 00:06:16.075 sys 0m0.544s 00:06:16.075 20:02:00 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:16.075 20:02:00 -- common/autotest_common.sh@10 -- # set +x 00:06:16.075 ************************************ 00:06:16.075 END TEST spdkcli_tcp 00:06:16.075 ************************************ 00:06:16.333 20:02:00 -- spdk/autotest.sh@176 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:16.333 20:02:00 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:16.333 20:02:00 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:16.333 20:02:00 -- common/autotest_common.sh@10 -- # set +x 00:06:16.333 ************************************ 00:06:16.333 START TEST dpdk_mem_utility 00:06:16.333 ************************************ 00:06:16.333 20:02:00 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:16.592 * Looking for test storage... 00:06:16.592 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:06:16.592 20:02:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:16.592 20:02:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1606342 00:06:16.592 20:02:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:06:16.592 20:02:00 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1606342 00:06:16.592 20:02:00 -- common/autotest_common.sh@827 -- # '[' -z 1606342 ']' 00:06:16.592 20:02:00 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:16.592 20:02:00 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:16.592 20:02:00 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:16.592 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:16.592 20:02:00 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:16.592 20:02:00 -- common/autotest_common.sh@10 -- # set +x 00:06:16.592 [2024-04-26 20:02:00.851183] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:16.592 [2024-04-26 20:02:00.851242] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606342 ] 00:06:16.592 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.592 [2024-04-26 20:02:00.935004] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:16.592 [2024-04-26 20:02:01.014046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:17.529 20:02:01 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:17.529 20:02:01 -- common/autotest_common.sh@860 -- # return 0 00:06:17.529 20:02:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:17.529 20:02:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:17.529 20:02:01 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:17.529 20:02:01 -- common/autotest_common.sh@10 -- # set +x 00:06:17.529 { 00:06:17.529 "filename": "/tmp/spdk_mem_dump.txt" 00:06:17.529 } 00:06:17.529 20:02:01 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:17.529 20:02:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:17.529 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:17.529 1 heaps totaling size 814.000000 MiB 00:06:17.529 size: 814.000000 MiB heap id: 0 00:06:17.529 end heaps---------- 00:06:17.529 8 mempools totaling size 598.116089 MiB 00:06:17.529 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:17.529 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:17.529 size: 84.521057 MiB name: bdev_io_1606342 00:06:17.529 size: 51.011292 MiB name: evtpool_1606342 00:06:17.529 size: 50.003479 MiB name: msgpool_1606342 00:06:17.529 size: 21.763794 MiB name: PDU_Pool 00:06:17.529 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:17.529 size: 0.026123 MiB name: Session_Pool 00:06:17.529 end mempools------- 00:06:17.529 6 memzones totaling size 4.142822 MiB 00:06:17.529 size: 1.000366 MiB name: RG_ring_0_1606342 00:06:17.529 size: 1.000366 MiB name: RG_ring_1_1606342 00:06:17.529 size: 1.000366 MiB name: RG_ring_4_1606342 00:06:17.529 size: 1.000366 MiB name: RG_ring_5_1606342 00:06:17.529 size: 0.125366 MiB name: RG_ring_2_1606342 00:06:17.529 size: 0.015991 MiB name: RG_ring_3_1606342 00:06:17.529 end memzones------- 00:06:17.530 20:02:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:17.530 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:06:17.530 list of free elements. 
size: 12.519348 MiB 00:06:17.530 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:17.530 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:17.530 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:17.530 element at address: 0x200003e00000 with size: 0.996277 MiB 00:06:17.530 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:17.530 element at address: 0x200013800000 with size: 0.978699 MiB 00:06:17.530 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:17.530 element at address: 0x200019200000 with size: 0.936584 MiB 00:06:17.530 element at address: 0x200000200000 with size: 0.841614 MiB 00:06:17.530 element at address: 0x20001aa00000 with size: 0.582886 MiB 00:06:17.530 element at address: 0x20000b200000 with size: 0.490723 MiB 00:06:17.530 element at address: 0x200000800000 with size: 0.487793 MiB 00:06:17.530 element at address: 0x200019400000 with size: 0.485657 MiB 00:06:17.530 element at address: 0x200027e00000 with size: 0.410034 MiB 00:06:17.530 element at address: 0x200003a00000 with size: 0.355530 MiB 00:06:17.530 list of standard malloc elements. size: 199.218079 MiB 00:06:17.530 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:17.530 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:17.530 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:17.530 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:17.530 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:17.530 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:17.530 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:17.530 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:17.530 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:17.530 element at address: 0x2000002d7740 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000002d7800 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000002d78c0 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000002d7ac0 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000002d7b80 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000002d7c40 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000003d9e40 with size: 0.000183 MiB 00:06:17.530 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:06:17.530 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200003a5b040 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200003adb300 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200003adb500 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200003adf7c0 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200003affa80 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200003affb40 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200003eff0c0 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:06:17.530 element at address: 0x20000b27da00 with size: 0.000183 MiB 00:06:17.530 element at address: 0x20000b27dac0 with size: 0.000183 MiB 00:06:17.530 element at address: 0x20000b2fdd80 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000138fa8c0 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000192efc40 with size: 0.000183 MiB 00:06:17.530 element at address: 0x2000192efd00 with size: 0.000183 MiB 
00:06:17.530 element at address: 0x2000194bc740 with size: 0.000183 MiB 00:06:17.530 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:06:17.530 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200027e68f80 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200027e69040 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200027e6fc40 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:06:17.530 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:06:17.530 list of memzone associated elements. size: 602.262573 MiB 00:06:17.530 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:06:17.530 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:06:17.530 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:06:17.530 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:06:17.530 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:06:17.530 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1606342_0 00:06:17.530 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:06:17.530 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1606342_0 00:06:17.530 element at address: 0x200003fff380 with size: 48.003052 MiB 00:06:17.530 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1606342_0 00:06:17.530 element at address: 0x2000195be940 with size: 20.255554 MiB 00:06:17.530 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:06:17.530 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:06:17.530 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:06:17.530 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:06:17.530 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1606342 00:06:17.530 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:06:17.530 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1606342 00:06:17.530 element at address: 0x2000002d7d00 with size: 1.008118 MiB 00:06:17.530 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1606342 00:06:17.530 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:06:17.530 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:06:17.530 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:06:17.530 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:06:17.530 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:06:17.530 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:06:17.530 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:06:17.530 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:06:17.530 element at address: 0x200003eff180 with size: 1.000488 MiB 00:06:17.530 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1606342 00:06:17.530 element at address: 0x200003affc00 with size: 1.000488 MiB 00:06:17.530 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1606342 00:06:17.530 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:06:17.530 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1606342 00:06:17.530 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:06:17.530 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1606342 00:06:17.530 element at address: 0x200003a5b100 with size: 0.500488 MiB 00:06:17.530 associated 
memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1606342 00:06:17.530 element at address: 0x20000b27db80 with size: 0.500488 MiB 00:06:17.530 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:06:17.530 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:06:17.530 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:06:17.530 element at address: 0x20001947c540 with size: 0.250488 MiB 00:06:17.530 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:06:17.530 element at address: 0x200003adf880 with size: 0.125488 MiB 00:06:17.530 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1606342 00:06:17.530 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:06:17.530 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:06:17.530 element at address: 0x200027e69100 with size: 0.023743 MiB 00:06:17.530 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:06:17.530 element at address: 0x200003adb5c0 with size: 0.016113 MiB 00:06:17.530 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1606342 00:06:17.530 element at address: 0x200027e6f240 with size: 0.002441 MiB 00:06:17.530 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:06:17.530 element at address: 0x2000002d7980 with size: 0.000305 MiB 00:06:17.530 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1606342 00:06:17.530 element at address: 0x200003adb3c0 with size: 0.000305 MiB 00:06:17.530 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1606342 00:06:17.530 element at address: 0x200027e6fd00 with size: 0.000305 MiB 00:06:17.530 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:17.530 20:02:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:17.530 20:02:01 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1606342 00:06:17.530 20:02:01 -- common/autotest_common.sh@946 -- # '[' -z 1606342 ']' 00:06:17.530 20:02:01 -- common/autotest_common.sh@950 -- # kill -0 1606342 00:06:17.530 20:02:01 -- common/autotest_common.sh@951 -- # uname 00:06:17.530 20:02:01 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:17.530 20:02:01 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1606342 00:06:17.530 20:02:01 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:17.530 20:02:01 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:17.530 20:02:01 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1606342' 00:06:17.530 killing process with pid 1606342 00:06:17.530 20:02:01 -- common/autotest_common.sh@965 -- # kill 1606342 00:06:17.530 20:02:01 -- common/autotest_common.sh@970 -- # wait 1606342 00:06:17.789 00:06:17.789 real 0m1.457s 00:06:17.789 user 0m1.493s 00:06:17.789 sys 0m0.438s 00:06:17.789 20:02:02 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:17.789 20:02:02 -- common/autotest_common.sh@10 -- # set +x 00:06:17.789 ************************************ 00:06:17.789 END TEST dpdk_mem_utility 00:06:17.789 ************************************ 00:06:17.789 20:02:02 -- spdk/autotest.sh@177 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:17.789 20:02:02 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:17.789 20:02:02 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:17.789 20:02:02 -- common/autotest_common.sh@10 -- # set +x 
00:06:18.047 ************************************ 00:06:18.047 START TEST event 00:06:18.047 ************************************ 00:06:18.048 20:02:02 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh 00:06:18.048 * Looking for test storage... 00:06:18.048 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:18.048 20:02:02 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh 00:06:18.048 20:02:02 -- bdev/nbd_common.sh@6 -- # set -e 00:06:18.048 20:02:02 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:18.048 20:02:02 -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:06:18.048 20:02:02 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:18.048 20:02:02 -- common/autotest_common.sh@10 -- # set +x 00:06:18.306 ************************************ 00:06:18.306 START TEST event_perf 00:06:18.306 ************************************ 00:06:18.306 20:02:02 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:06:18.306 Running I/O for 1 seconds...[2024-04-26 20:02:02.637956] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:18.306 [2024-04-26 20:02:02.638038] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606595 ] 00:06:18.306 EAL: No free 2048 kB hugepages reported on node 1 00:06:18.306 [2024-04-26 20:02:02.724956] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:18.564 [2024-04-26 20:02:02.809443] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:18.564 [2024-04-26 20:02:02.809536] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:18.564 [2024-04-26 20:02:02.809558] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:18.564 [2024-04-26 20:02:02.809560] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.501 Running I/O for 1 seconds... 00:06:19.501 lcore 0: 191341 00:06:19.501 lcore 1: 191339 00:06:19.501 lcore 2: 191340 00:06:19.501 lcore 3: 191341 00:06:19.501 done. 
00:06:19.501 00:06:19.501 real 0m1.267s 00:06:19.501 user 0m4.151s 00:06:19.501 sys 0m0.110s 00:06:19.501 20:02:03 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:19.501 20:02:03 -- common/autotest_common.sh@10 -- # set +x 00:06:19.501 ************************************ 00:06:19.501 END TEST event_perf 00:06:19.501 ************************************ 00:06:19.501 20:02:03 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:19.501 20:02:03 -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:06:19.501 20:02:03 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:19.501 20:02:03 -- common/autotest_common.sh@10 -- # set +x 00:06:19.759 ************************************ 00:06:19.759 START TEST event_reactor 00:06:19.759 ************************************ 00:06:19.759 20:02:04 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:06:19.759 [2024-04-26 20:02:04.105802] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:19.759 [2024-04-26 20:02:04.105933] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1606805 ] 00:06:19.759 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.759 [2024-04-26 20:02:04.191882] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.018 [2024-04-26 20:02:04.278445] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.952 test_start 00:06:20.952 oneshot 00:06:20.952 tick 100 00:06:20.952 tick 100 00:06:20.952 tick 250 00:06:20.952 tick 100 00:06:20.952 tick 100 00:06:20.952 tick 100 00:06:20.952 tick 250 00:06:20.952 tick 500 00:06:20.952 tick 100 00:06:20.952 tick 100 00:06:20.952 tick 250 00:06:20.952 tick 100 00:06:20.952 tick 100 00:06:20.952 test_end 00:06:20.952 00:06:20.952 real 0m1.263s 00:06:20.952 user 0m1.160s 00:06:20.952 sys 0m0.099s 00:06:20.952 20:02:05 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:20.952 20:02:05 -- common/autotest_common.sh@10 -- # set +x 00:06:20.952 ************************************ 00:06:20.952 END TEST event_reactor 00:06:20.952 ************************************ 00:06:20.952 20:02:05 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:20.952 20:02:05 -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:06:20.952 20:02:05 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:20.952 20:02:05 -- common/autotest_common.sh@10 -- # set +x 00:06:21.209 ************************************ 00:06:21.209 START TEST event_reactor_perf 00:06:21.209 ************************************ 00:06:21.209 20:02:05 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:21.209 [2024-04-26 20:02:05.578653] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:06:21.209 [2024-04-26 20:02:05.578740] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1607003 ] 00:06:21.209 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.468 [2024-04-26 20:02:05.665421] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:21.468 [2024-04-26 20:02:05.753241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.405 test_start 00:06:22.405 test_end 00:06:22.405 Performance: 914856 events per second 00:06:22.405 00:06:22.405 real 0m1.268s 00:06:22.405 user 0m1.162s 00:06:22.405 sys 0m0.102s 00:06:22.405 20:02:06 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:22.405 20:02:06 -- common/autotest_common.sh@10 -- # set +x 00:06:22.405 ************************************ 00:06:22.405 END TEST event_reactor_perf 00:06:22.405 ************************************ 00:06:22.663 20:02:06 -- event/event.sh@49 -- # uname -s 00:06:22.663 20:02:06 -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:22.663 20:02:06 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:22.663 20:02:06 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:22.663 20:02:06 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:22.664 20:02:06 -- common/autotest_common.sh@10 -- # set +x 00:06:22.664 ************************************ 00:06:22.664 START TEST event_scheduler 00:06:22.664 ************************************ 00:06:22.664 20:02:07 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:22.924 * Looking for test storage... 00:06:22.924 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler 00:06:22.924 20:02:07 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:22.924 20:02:07 -- scheduler/scheduler.sh@35 -- # scheduler_pid=1607278 00:06:22.924 20:02:07 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:22.924 20:02:07 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:22.924 20:02:07 -- scheduler/scheduler.sh@37 -- # waitforlisten 1607278 00:06:22.924 20:02:07 -- common/autotest_common.sh@827 -- # '[' -z 1607278 ']' 00:06:22.924 20:02:07 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:22.924 20:02:07 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:22.924 20:02:07 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:22.924 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:22.924 20:02:07 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:22.924 20:02:07 -- common/autotest_common.sh@10 -- # set +x 00:06:22.924 [2024-04-26 20:02:07.151505] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:06:22.924 [2024-04-26 20:02:07.151568] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1607278 ] 00:06:22.924 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.924 [2024-04-26 20:02:07.232981] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:22.924 [2024-04-26 20:02:07.315386] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.924 [2024-04-26 20:02:07.315461] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:22.924 [2024-04-26 20:02:07.315538] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:22.924 [2024-04-26 20:02:07.315542] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:23.860 20:02:08 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:23.860 20:02:08 -- common/autotest_common.sh@860 -- # return 0 00:06:23.860 20:02:08 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:23.860 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:23.860 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:23.860 POWER: Env isn't set yet! 00:06:23.860 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:23.861 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:23.861 POWER: Cannot set governor of lcore 0 to userspace 00:06:23.861 POWER: Attempting to initialise PSTAT power management... 00:06:23.861 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:23.861 POWER: Initialized successfully for lcore 0 power management 00:06:23.861 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:23.861 POWER: Initialized successfully for lcore 1 power management 00:06:23.861 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:23.861 POWER: Initialized successfully for lcore 2 power management 00:06:23.861 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:23.861 POWER: Initialized successfully for lcore 3 power management 00:06:23.861 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:23.861 20:02:08 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:23.861 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:23.861 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:23.861 [2024-04-26 20:02:08.123895] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
00:06:23.861 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:23.861 20:02:08 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:23.861 20:02:08 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:23.861 20:02:08 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:23.861 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:23.861 ************************************ 00:06:23.861 START TEST scheduler_create_thread 00:06:23.861 ************************************ 00:06:23.861 20:02:08 -- common/autotest_common.sh@1121 -- # scheduler_create_thread 00:06:23.861 20:02:08 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:23.861 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:23.861 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:23.861 2 00:06:23.861 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:23.861 20:02:08 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:23.861 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:23.861 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 3 00:06:24.120 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 4 00:06:24.120 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 5 00:06:24.120 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 6 00:06:24.120 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 7 00:06:24.120 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 8 00:06:24.120 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 9 00:06:24.120 
20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 10 00:06:24.120 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:24.120 20:02:08 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:24.120 20:02:08 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:24.120 20:02:08 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:24.120 20:02:08 -- common/autotest_common.sh@10 -- # set +x 00:06:25.058 20:02:09 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:25.058 20:02:09 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:25.058 20:02:09 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:25.058 20:02:09 -- common/autotest_common.sh@10 -- # set +x 00:06:26.494 20:02:10 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:26.494 20:02:10 -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:26.494 20:02:10 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:26.494 20:02:10 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:26.494 20:02:10 -- common/autotest_common.sh@10 -- # set +x 00:06:27.430 20:02:11 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:27.430 00:06:27.430 real 0m3.484s 00:06:27.430 user 0m0.021s 00:06:27.430 sys 0m0.010s 00:06:27.430 20:02:11 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:27.430 20:02:11 -- common/autotest_common.sh@10 -- # set +x 00:06:27.430 ************************************ 00:06:27.430 END TEST scheduler_create_thread 00:06:27.430 ************************************ 00:06:27.430 20:02:11 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:27.430 20:02:11 -- scheduler/scheduler.sh@46 -- # killprocess 1607278 00:06:27.430 20:02:11 -- common/autotest_common.sh@946 -- # '[' -z 1607278 ']' 00:06:27.430 20:02:11 -- common/autotest_common.sh@950 -- # kill -0 1607278 00:06:27.430 20:02:11 -- common/autotest_common.sh@951 -- # uname 00:06:27.430 20:02:11 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:27.430 20:02:11 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1607278 00:06:27.430 20:02:11 -- common/autotest_common.sh@952 -- # process_name=reactor_2 00:06:27.430 20:02:11 -- common/autotest_common.sh@956 -- # '[' reactor_2 = sudo ']' 00:06:27.430 20:02:11 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1607278' 00:06:27.430 killing process with pid 1607278 00:06:27.430 20:02:11 -- common/autotest_common.sh@965 -- # kill 1607278 00:06:27.430 20:02:11 -- common/autotest_common.sh@970 -- # wait 1607278 00:06:27.998 [2024-04-26 20:02:12.249334] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 
00:06:27.998 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:27.998 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:27.998 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:27.998 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:27.998 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:27.998 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:27.998 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:27.998 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:28.257 00:06:28.257 real 0m5.412s 00:06:28.257 user 0m8.877s 00:06:28.257 sys 0m0.548s 00:06:28.257 20:02:12 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:28.257 20:02:12 -- common/autotest_common.sh@10 -- # set +x 00:06:28.257 ************************************ 00:06:28.257 END TEST event_scheduler 00:06:28.257 ************************************ 00:06:28.257 20:02:12 -- event/event.sh@51 -- # modprobe -n nbd 00:06:28.257 20:02:12 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:28.257 20:02:12 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:28.257 20:02:12 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:28.257 20:02:12 -- common/autotest_common.sh@10 -- # set +x 00:06:28.257 ************************************ 00:06:28.257 START TEST app_repeat 00:06:28.257 ************************************ 00:06:28.257 20:02:12 -- common/autotest_common.sh@1121 -- # app_repeat_test 00:06:28.257 20:02:12 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:28.257 20:02:12 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:28.257 20:02:12 -- event/event.sh@13 -- # local nbd_list 00:06:28.257 20:02:12 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:28.257 20:02:12 -- event/event.sh@14 -- # local bdev_list 00:06:28.257 20:02:12 -- event/event.sh@15 -- # local repeat_times=4 00:06:28.257 20:02:12 -- event/event.sh@17 -- # modprobe nbd 00:06:28.257 20:02:12 -- event/event.sh@19 -- # repeat_pid=1608116 00:06:28.257 20:02:12 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:28.257 20:02:12 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:28.257 20:02:12 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1608116' 00:06:28.257 Process app_repeat pid: 1608116 00:06:28.257 20:02:12 -- event/event.sh@23 -- # for i in {0..2} 00:06:28.257 20:02:12 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:28.257 spdk_app_start Round 0 00:06:28.257 20:02:12 -- event/event.sh@25 -- # waitforlisten 1608116 /var/tmp/spdk-nbd.sock 00:06:28.257 20:02:12 -- common/autotest_common.sh@827 -- # '[' -z 1608116 ']' 00:06:28.257 20:02:12 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:28.257 20:02:12 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:28.257 20:02:12 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:28.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:28.257 20:02:12 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:28.257 20:02:12 -- common/autotest_common.sh@10 -- # set +x 00:06:28.257 [2024-04-26 20:02:12.676203] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:28.257 [2024-04-26 20:02:12.676285] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1608116 ] 00:06:28.514 EAL: No free 2048 kB hugepages reported on node 1 00:06:28.514 [2024-04-26 20:02:12.762827] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:28.514 [2024-04-26 20:02:12.853111] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:28.514 [2024-04-26 20:02:12.853114] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.081 20:02:13 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:29.082 20:02:13 -- common/autotest_common.sh@860 -- # return 0 00:06:29.082 20:02:13 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:29.341 Malloc0 00:06:29.341 20:02:13 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:29.600 Malloc1 00:06:29.600 20:02:13 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@12 -- # local i 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:29.600 20:02:13 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:29.859 /dev/nbd0 00:06:29.859 20:02:14 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:29.859 20:02:14 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:29.859 20:02:14 -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:06:29.859 20:02:14 -- common/autotest_common.sh@865 -- # local i 00:06:29.859 20:02:14 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:29.859 20:02:14 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:29.859 20:02:14 -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:06:29.859 20:02:14 -- 
common/autotest_common.sh@869 -- # break 00:06:29.859 20:02:14 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:29.859 20:02:14 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:29.859 20:02:14 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:29.859 1+0 records in 00:06:29.859 1+0 records out 00:06:29.859 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235502 s, 17.4 MB/s 00:06:29.859 20:02:14 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:29.859 20:02:14 -- common/autotest_common.sh@882 -- # size=4096 00:06:29.859 20:02:14 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:29.859 20:02:14 -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:29.859 20:02:14 -- common/autotest_common.sh@885 -- # return 0 00:06:29.859 20:02:14 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:29.859 20:02:14 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:29.859 20:02:14 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:30.119 /dev/nbd1 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:30.119 20:02:14 -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:06:30.119 20:02:14 -- common/autotest_common.sh@865 -- # local i 00:06:30.119 20:02:14 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:30.119 20:02:14 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:30.119 20:02:14 -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:06:30.119 20:02:14 -- common/autotest_common.sh@869 -- # break 00:06:30.119 20:02:14 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:30.119 20:02:14 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:30.119 20:02:14 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:30.119 1+0 records in 00:06:30.119 1+0 records out 00:06:30.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225078 s, 18.2 MB/s 00:06:30.119 20:02:14 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:30.119 20:02:14 -- common/autotest_common.sh@882 -- # size=4096 00:06:30.119 20:02:14 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:30.119 20:02:14 -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:30.119 20:02:14 -- common/autotest_common.sh@885 -- # return 0 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:30.119 { 00:06:30.119 "nbd_device": "/dev/nbd0", 00:06:30.119 "bdev_name": "Malloc0" 00:06:30.119 }, 00:06:30.119 { 00:06:30.119 "nbd_device": 
"/dev/nbd1", 00:06:30.119 "bdev_name": "Malloc1" 00:06:30.119 } 00:06:30.119 ]' 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:30.119 { 00:06:30.119 "nbd_device": "/dev/nbd0", 00:06:30.119 "bdev_name": "Malloc0" 00:06:30.119 }, 00:06:30.119 { 00:06:30.119 "nbd_device": "/dev/nbd1", 00:06:30.119 "bdev_name": "Malloc1" 00:06:30.119 } 00:06:30.119 ]' 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:30.119 /dev/nbd1' 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:30.119 /dev/nbd1' 00:06:30.119 20:02:14 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@65 -- # count=2 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@95 -- # count=2 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:30.379 256+0 records in 00:06:30.379 256+0 records out 00:06:30.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108694 s, 96.5 MB/s 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:30.379 256+0 records in 00:06:30.379 256+0 records out 00:06:30.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210841 s, 49.7 MB/s 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:30.379 256+0 records in 00:06:30.379 256+0 records out 00:06:30.379 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0223193 s, 47.0 MB/s 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:30.379 20:02:14 -- 
bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@51 -- # local i 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:30.379 20:02:14 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@41 -- # break 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@45 -- # return 0 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:30.638 20:02:14 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@41 -- # break 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@45 -- # return 0 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:30.638 20:02:15 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@65 -- # true 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@65 -- # count=0 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@104 -- # count=0 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:30.897 20:02:15 -- bdev/nbd_common.sh@109 -- # return 0 00:06:30.897 20:02:15 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 
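The data check inside each app_repeat round follows the same pattern every time; stripped of the xtrace noise it is roughly the following (a sketch with paths shortened: $rpc stands for scripts/rpc.py -s /var/tmp/spdk-nbd.sock, and the 256 blocks of 4096 bytes are exactly the 1M that cmp re-reads):

    # write 1 MiB of random data, copy it onto each exported NBD device, then read it back and compare
    dd if=/dev/urandom of=nbdrandtest bs=4096 count=256
    for nbd in /dev/nbd0 /dev/nbd1; do
        dd if=nbdrandtest of=$nbd bs=4096 count=256 oflag=direct
    done
    for nbd in /dev/nbd0 /dev/nbd1; do
        cmp -b -n 1M nbdrandtest $nbd        # any mismatch fails the round
    done
    rm nbdrandtest
    $rpc nbd_stop_disk /dev/nbd0             # unexport both devices again
    $rpc nbd_stop_disk /dev/nbd1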
00:06:31.156 20:02:15 -- event/event.sh@35 -- # sleep 3 00:06:31.415 [2024-04-26 20:02:15.655347] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:31.415 [2024-04-26 20:02:15.733774] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:31.415 [2024-04-26 20:02:15.733777] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.415 [2024-04-26 20:02:15.779559] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:31.415 [2024-04-26 20:02:15.779606] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:34.702 20:02:18 -- event/event.sh@23 -- # for i in {0..2} 00:06:34.702 20:02:18 -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:34.702 spdk_app_start Round 1 00:06:34.702 20:02:18 -- event/event.sh@25 -- # waitforlisten 1608116 /var/tmp/spdk-nbd.sock 00:06:34.702 20:02:18 -- common/autotest_common.sh@827 -- # '[' -z 1608116 ']' 00:06:34.702 20:02:18 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:34.702 20:02:18 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:34.702 20:02:18 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:34.702 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:34.702 20:02:18 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:34.702 20:02:18 -- common/autotest_common.sh@10 -- # set +x 00:06:34.702 20:02:18 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:34.702 20:02:18 -- common/autotest_common.sh@860 -- # return 0 00:06:34.702 20:02:18 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:34.702 Malloc0 00:06:34.702 20:02:18 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:34.702 Malloc1 00:06:34.702 20:02:18 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@12 -- # local i 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:34.702 20:02:18 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:34.960 
/dev/nbd0 00:06:34.960 20:02:19 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:34.960 20:02:19 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:34.960 20:02:19 -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:06:34.960 20:02:19 -- common/autotest_common.sh@865 -- # local i 00:06:34.960 20:02:19 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:34.960 20:02:19 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:34.960 20:02:19 -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:06:34.960 20:02:19 -- common/autotest_common.sh@869 -- # break 00:06:34.960 20:02:19 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:34.960 20:02:19 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:34.960 20:02:19 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:34.960 1+0 records in 00:06:34.960 1+0 records out 00:06:34.960 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237341 s, 17.3 MB/s 00:06:34.961 20:02:19 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:34.961 20:02:19 -- common/autotest_common.sh@882 -- # size=4096 00:06:34.961 20:02:19 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:34.961 20:02:19 -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:34.961 20:02:19 -- common/autotest_common.sh@885 -- # return 0 00:06:34.961 20:02:19 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:34.961 20:02:19 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:34.961 20:02:19 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:34.961 /dev/nbd1 00:06:34.961 20:02:19 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:34.961 20:02:19 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:34.961 20:02:19 -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:06:34.961 20:02:19 -- common/autotest_common.sh@865 -- # local i 00:06:34.961 20:02:19 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:34.961 20:02:19 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:34.961 20:02:19 -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:06:34.961 20:02:19 -- common/autotest_common.sh@869 -- # break 00:06:34.961 20:02:19 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:34.961 20:02:19 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:34.961 20:02:19 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:34.961 1+0 records in 00:06:34.961 1+0 records out 00:06:34.961 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253857 s, 16.1 MB/s 00:06:34.961 20:02:19 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:35.220 20:02:19 -- common/autotest_common.sh@882 -- # size=4096 00:06:35.220 20:02:19 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:35.220 20:02:19 -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:35.220 20:02:19 -- common/autotest_common.sh@885 -- # return 0 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@14 -- # (( i < 2 
)) 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:35.220 { 00:06:35.220 "nbd_device": "/dev/nbd0", 00:06:35.220 "bdev_name": "Malloc0" 00:06:35.220 }, 00:06:35.220 { 00:06:35.220 "nbd_device": "/dev/nbd1", 00:06:35.220 "bdev_name": "Malloc1" 00:06:35.220 } 00:06:35.220 ]' 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:35.220 { 00:06:35.220 "nbd_device": "/dev/nbd0", 00:06:35.220 "bdev_name": "Malloc0" 00:06:35.220 }, 00:06:35.220 { 00:06:35.220 "nbd_device": "/dev/nbd1", 00:06:35.220 "bdev_name": "Malloc1" 00:06:35.220 } 00:06:35.220 ]' 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:35.220 /dev/nbd1' 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:35.220 /dev/nbd1' 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@65 -- # count=2 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@95 -- # count=2 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:35.220 256+0 records in 00:06:35.220 256+0 records out 00:06:35.220 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114717 s, 91.4 MB/s 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:35.220 20:02:19 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:35.479 256+0 records in 00:06:35.479 256+0 records out 00:06:35.479 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210649 s, 49.8 MB/s 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:35.479 256+0 records in 00:06:35.479 256+0 records out 00:06:35.479 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0224206 s, 46.8 MB/s 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@51 -- # local i 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:35.479 20:02:19 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@41 -- # break 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@45 -- # return 0 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:35.738 20:02:19 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@41 -- # break 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@45 -- # return 0 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:35.738 20:02:20 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:35.996 20:02:20 -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@65 -- # true 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@65 -- # count=0 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@104 -- # count=0 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:35.996 20:02:20 -- bdev/nbd_common.sh@109 -- # return 0 00:06:35.996 20:02:20 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:36.256 20:02:20 -- event/event.sh@35 -- # sleep 3 00:06:36.515 [2024-04-26 20:02:20.765574] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:36.515 [2024-04-26 20:02:20.847619] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:36.515 [2024-04-26 20:02:20.847621] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:36.515 [2024-04-26 20:02:20.894267] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:36.515 [2024-04-26 20:02:20.894312] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:39.800 20:02:23 -- event/event.sh@23 -- # for i in {0..2} 00:06:39.800 20:02:23 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:39.800 spdk_app_start Round 2 00:06:39.800 20:02:23 -- event/event.sh@25 -- # waitforlisten 1608116 /var/tmp/spdk-nbd.sock 00:06:39.800 20:02:23 -- common/autotest_common.sh@827 -- # '[' -z 1608116 ']' 00:06:39.800 20:02:23 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:39.800 20:02:23 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:39.800 20:02:23 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:39.800 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
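The nbd_get_count checks that bracket each round (count=2 while Malloc0/Malloc1 are exported, count=0 after they are stopped) parse the RPC output as sketched below ($rpc again stands for scripts/rpc.py -s /var/tmp/spdk-nbd.sock; a trailing '|| true' would explain the bare 'true' lines in the trace when grep matches nothing):

    # nbd_get_disks returns a JSON array of {nbd_device, bdev_name} pairs;
    # counting the /dev/nbd entries gives the number of active exports
    count=$($rpc nbd_get_disks | jq -r '.[] | .nbd_device' | grep -c /dev/nbd || true)
    [ "$count" -eq 2 ]      # expected while the two malloc bdevs are exported
    # after nbd_stop_disk on both devices the same pipeline must yield 0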
00:06:39.800 20:02:23 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:39.800 20:02:23 -- common/autotest_common.sh@10 -- # set +x 00:06:39.800 20:02:23 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:39.800 20:02:23 -- common/autotest_common.sh@860 -- # return 0 00:06:39.800 20:02:23 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:39.800 Malloc0 00:06:39.800 20:02:23 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:39.800 Malloc1 00:06:39.800 20:02:24 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@12 -- # local i 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:39.800 20:02:24 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:40.059 /dev/nbd0 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:40.059 20:02:24 -- common/autotest_common.sh@864 -- # local nbd_name=nbd0 00:06:40.059 20:02:24 -- common/autotest_common.sh@865 -- # local i 00:06:40.059 20:02:24 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:40.059 20:02:24 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:40.059 20:02:24 -- common/autotest_common.sh@868 -- # grep -q -w nbd0 /proc/partitions 00:06:40.059 20:02:24 -- common/autotest_common.sh@869 -- # break 00:06:40.059 20:02:24 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:40.059 20:02:24 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:40.059 20:02:24 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:40.059 1+0 records in 00:06:40.059 1+0 records out 00:06:40.059 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225676 s, 18.1 MB/s 00:06:40.059 20:02:24 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:40.059 20:02:24 -- common/autotest_common.sh@882 -- # size=4096 00:06:40.059 20:02:24 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:40.059 20:02:24 -- 
common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:40.059 20:02:24 -- common/autotest_common.sh@885 -- # return 0 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:40.059 /dev/nbd1 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:40.059 20:02:24 -- common/autotest_common.sh@864 -- # local nbd_name=nbd1 00:06:40.059 20:02:24 -- common/autotest_common.sh@865 -- # local i 00:06:40.059 20:02:24 -- common/autotest_common.sh@867 -- # (( i = 1 )) 00:06:40.059 20:02:24 -- common/autotest_common.sh@867 -- # (( i <= 20 )) 00:06:40.059 20:02:24 -- common/autotest_common.sh@868 -- # grep -q -w nbd1 /proc/partitions 00:06:40.059 20:02:24 -- common/autotest_common.sh@869 -- # break 00:06:40.059 20:02:24 -- common/autotest_common.sh@880 -- # (( i = 1 )) 00:06:40.059 20:02:24 -- common/autotest_common.sh@880 -- # (( i <= 20 )) 00:06:40.059 20:02:24 -- common/autotest_common.sh@881 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:40.059 1+0 records in 00:06:40.059 1+0 records out 00:06:40.059 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250293 s, 16.4 MB/s 00:06:40.059 20:02:24 -- common/autotest_common.sh@882 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:40.059 20:02:24 -- common/autotest_common.sh@882 -- # size=4096 00:06:40.059 20:02:24 -- common/autotest_common.sh@883 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:06:40.059 20:02:24 -- common/autotest_common.sh@884 -- # '[' 4096 '!=' 0 ']' 00:06:40.059 20:02:24 -- common/autotest_common.sh@885 -- # return 0 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:40.059 20:02:24 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:40.318 { 00:06:40.318 "nbd_device": "/dev/nbd0", 00:06:40.318 "bdev_name": "Malloc0" 00:06:40.318 }, 00:06:40.318 { 00:06:40.318 "nbd_device": "/dev/nbd1", 00:06:40.318 "bdev_name": "Malloc1" 00:06:40.318 } 00:06:40.318 ]' 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:40.318 { 00:06:40.318 "nbd_device": "/dev/nbd0", 00:06:40.318 "bdev_name": "Malloc0" 00:06:40.318 }, 00:06:40.318 { 00:06:40.318 "nbd_device": "/dev/nbd1", 00:06:40.318 "bdev_name": "Malloc1" 00:06:40.318 } 00:06:40.318 ]' 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:40.318 /dev/nbd1' 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:40.318 /dev/nbd1' 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@65 -- # count=2 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@66 -- # echo 2 00:06:40.318 20:02:24 -- 
bdev/nbd_common.sh@95 -- # count=2 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:40.318 256+0 records in 00:06:40.318 256+0 records out 00:06:40.318 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0109602 s, 95.7 MB/s 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:40.318 20:02:24 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:40.577 256+0 records in 00:06:40.577 256+0 records out 00:06:40.577 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0210804 s, 49.7 MB/s 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:40.577 256+0 records in 00:06:40.577 256+0 records out 00:06:40.577 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.022132 s, 47.4 MB/s 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@51 -- # local i 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:40.577 20:02:24 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@41 -- # break 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@45 -- # return 0 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:40.577 20:02:25 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@41 -- # break 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@45 -- # return 0 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:40.836 20:02:25 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@65 -- # echo '' 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@65 -- # true 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@65 -- # count=0 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@66 -- # echo 0 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@104 -- # count=0 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:41.095 20:02:25 -- bdev/nbd_common.sh@109 -- # return 0 00:06:41.095 20:02:25 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:41.354 20:02:25 -- event/event.sh@35 -- # sleep 3 00:06:41.613 [2024-04-26 20:02:25.821179] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:41.613 [2024-04-26 20:02:25.902052] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:41.613 [2024-04-26 20:02:25.902055] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.613 [2024-04-26 20:02:25.948834] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:41.613 [2024-04-26 20:02:25.948884] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
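Taken together, the rounds above are driven by a loop of roughly this shape (a condensed sketch of what the trace shows; the app under test was started earlier as app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 and comes back up after each SIGTERM, $repeat_pid being its pid):

    for i in {0..2}; do
        echo "spdk_app_start Round $i"
        waitforlisten $repeat_pid /var/tmp/spdk-nbd.sock   # wait for the app's RPC socket
        $rpc bdev_malloc_create 64 4096                    # Malloc0
        $rpc bdev_malloc_create 64 4096                    # Malloc1
        # export both bdevs over NBD, run the write/verify pass shown earlier, unexport
        $rpc spdk_kill_instance SIGTERM                    # end this round
        sleep 3                                            # let the app restart for the next round
    done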
00:06:44.901 20:02:28 -- event/event.sh@38 -- # waitforlisten 1608116 /var/tmp/spdk-nbd.sock 00:06:44.901 20:02:28 -- common/autotest_common.sh@827 -- # '[' -z 1608116 ']' 00:06:44.901 20:02:28 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:44.901 20:02:28 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:44.901 20:02:28 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:44.901 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:44.901 20:02:28 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:44.901 20:02:28 -- common/autotest_common.sh@10 -- # set +x 00:06:44.901 20:02:28 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:44.901 20:02:28 -- common/autotest_common.sh@860 -- # return 0 00:06:44.901 20:02:28 -- event/event.sh@39 -- # killprocess 1608116 00:06:44.901 20:02:28 -- common/autotest_common.sh@946 -- # '[' -z 1608116 ']' 00:06:44.901 20:02:28 -- common/autotest_common.sh@950 -- # kill -0 1608116 00:06:44.901 20:02:28 -- common/autotest_common.sh@951 -- # uname 00:06:44.901 20:02:28 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:44.901 20:02:28 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1608116 00:06:44.901 20:02:28 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:44.901 20:02:28 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:44.901 20:02:28 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1608116' 00:06:44.901 killing process with pid 1608116 00:06:44.901 20:02:28 -- common/autotest_common.sh@965 -- # kill 1608116 00:06:44.901 20:02:28 -- common/autotest_common.sh@970 -- # wait 1608116 00:06:44.901 spdk_app_start is called in Round 0. 00:06:44.901 Shutdown signal received, stop current app iteration 00:06:44.901 Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 reinitialization... 00:06:44.901 spdk_app_start is called in Round 1. 00:06:44.901 Shutdown signal received, stop current app iteration 00:06:44.901 Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 reinitialization... 00:06:44.901 spdk_app_start is called in Round 2. 00:06:44.901 Shutdown signal received, stop current app iteration 00:06:44.901 Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 reinitialization... 00:06:44.901 spdk_app_start is called in Round 3. 
00:06:44.901 Shutdown signal received, stop current app iteration 00:06:44.901 20:02:29 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:44.901 20:02:29 -- event/event.sh@42 -- # return 0 00:06:44.901 00:06:44.901 real 0m16.364s 00:06:44.901 user 0m34.479s 00:06:44.901 sys 0m3.221s 00:06:44.901 20:02:29 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:44.901 20:02:29 -- common/autotest_common.sh@10 -- # set +x 00:06:44.901 ************************************ 00:06:44.901 END TEST app_repeat 00:06:44.901 ************************************ 00:06:44.901 20:02:29 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:44.901 20:02:29 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:44.901 20:02:29 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:44.902 20:02:29 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:44.902 20:02:29 -- common/autotest_common.sh@10 -- # set +x 00:06:44.902 ************************************ 00:06:44.902 START TEST cpu_locks 00:06:44.902 ************************************ 00:06:44.902 20:02:29 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:44.902 * Looking for test storage... 00:06:44.902 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:44.902 20:02:29 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:44.902 20:02:29 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:44.902 20:02:29 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:44.902 20:02:29 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:44.902 20:02:29 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:44.902 20:02:29 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:44.902 20:02:29 -- common/autotest_common.sh@10 -- # set +x 00:06:45.161 ************************************ 00:06:45.161 START TEST default_locks 00:06:45.161 ************************************ 00:06:45.161 20:02:29 -- common/autotest_common.sh@1121 -- # default_locks 00:06:45.161 20:02:29 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=1610530 00:06:45.161 20:02:29 -- event/cpu_locks.sh@47 -- # waitforlisten 1610530 00:06:45.161 20:02:29 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:45.161 20:02:29 -- common/autotest_common.sh@827 -- # '[' -z 1610530 ']' 00:06:45.161 20:02:29 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:45.161 20:02:29 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:45.161 20:02:29 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:45.161 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:45.161 20:02:29 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:45.161 20:02:29 -- common/autotest_common.sh@10 -- # set +x 00:06:45.161 [2024-04-26 20:02:29.486113] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:06:45.161 [2024-04-26 20:02:29.486194] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1610530 ] 00:06:45.161 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.161 [2024-04-26 20:02:29.570529] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.420 [2024-04-26 20:02:29.652238] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.988 20:02:30 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:45.988 20:02:30 -- common/autotest_common.sh@860 -- # return 0 00:06:45.988 20:02:30 -- event/cpu_locks.sh@49 -- # locks_exist 1610530 00:06:45.988 20:02:30 -- event/cpu_locks.sh@22 -- # lslocks -p 1610530 00:06:45.988 20:02:30 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:46.924 lslocks: write error 00:06:46.924 20:02:31 -- event/cpu_locks.sh@50 -- # killprocess 1610530 00:06:46.924 20:02:31 -- common/autotest_common.sh@946 -- # '[' -z 1610530 ']' 00:06:46.924 20:02:31 -- common/autotest_common.sh@950 -- # kill -0 1610530 00:06:46.924 20:02:31 -- common/autotest_common.sh@951 -- # uname 00:06:46.924 20:02:31 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:46.924 20:02:31 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1610530 00:06:46.924 20:02:31 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:46.924 20:02:31 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:46.924 20:02:31 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1610530' 00:06:46.924 killing process with pid 1610530 00:06:46.924 20:02:31 -- common/autotest_common.sh@965 -- # kill 1610530 00:06:46.924 20:02:31 -- common/autotest_common.sh@970 -- # wait 1610530 00:06:47.183 20:02:31 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 1610530 00:06:47.183 20:02:31 -- common/autotest_common.sh@648 -- # local es=0 00:06:47.183 20:02:31 -- common/autotest_common.sh@650 -- # valid_exec_arg waitforlisten 1610530 00:06:47.183 20:02:31 -- common/autotest_common.sh@636 -- # local arg=waitforlisten 00:06:47.183 20:02:31 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:47.183 20:02:31 -- common/autotest_common.sh@640 -- # type -t waitforlisten 00:06:47.183 20:02:31 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:47.183 20:02:31 -- common/autotest_common.sh@651 -- # waitforlisten 1610530 00:06:47.183 20:02:31 -- common/autotest_common.sh@827 -- # '[' -z 1610530 ']' 00:06:47.183 20:02:31 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:47.183 20:02:31 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:47.183 20:02:31 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:47.183 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:47.183 20:02:31 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:47.183 20:02:31 -- common/autotest_common.sh@10 -- # set +x 00:06:47.183 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 842: kill: (1610530) - No such process 00:06:47.183 ERROR: process (pid: 1610530) is no longer running 00:06:47.183 20:02:31 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:47.183 20:02:31 -- common/autotest_common.sh@860 -- # return 1 00:06:47.183 20:02:31 -- common/autotest_common.sh@651 -- # es=1 00:06:47.183 20:02:31 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:47.183 20:02:31 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:47.183 20:02:31 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:47.183 20:02:31 -- event/cpu_locks.sh@54 -- # no_locks 00:06:47.183 20:02:31 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:47.183 20:02:31 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:47.183 20:02:31 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:47.183 00:06:47.183 real 0m1.986s 00:06:47.183 user 0m2.045s 00:06:47.183 sys 0m0.781s 00:06:47.183 20:02:31 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:47.183 20:02:31 -- common/autotest_common.sh@10 -- # set +x 00:06:47.183 ************************************ 00:06:47.183 END TEST default_locks 00:06:47.183 ************************************ 00:06:47.183 20:02:31 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:47.183 20:02:31 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:47.183 20:02:31 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:47.183 20:02:31 -- common/autotest_common.sh@10 -- # set +x 00:06:47.442 ************************************ 00:06:47.442 START TEST default_locks_via_rpc 00:06:47.442 ************************************ 00:06:47.442 20:02:31 -- common/autotest_common.sh@1121 -- # default_locks_via_rpc 00:06:47.442 20:02:31 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=1610914 00:06:47.442 20:02:31 -- event/cpu_locks.sh@63 -- # waitforlisten 1610914 00:06:47.442 20:02:31 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:47.442 20:02:31 -- common/autotest_common.sh@827 -- # '[' -z 1610914 ']' 00:06:47.442 20:02:31 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:47.442 20:02:31 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:47.442 20:02:31 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:47.442 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:47.442 20:02:31 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:47.442 20:02:31 -- common/autotest_common.sh@10 -- # set +x 00:06:47.442 [2024-04-26 20:02:31.672964] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
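The default_locks case that has just finished reduces to: start a target on core 0, confirm it holds the spdk_cpu_lock file lock, kill it, and confirm that waiting on the same pid now fails. In outline (a sketch; locks_exist, killprocess, NOT and waitforlisten are the autotest helpers traced above, and the pid is the one from this run):

    spdk_tgt -m 0x1 &                       # pid 1610530 in this run
    spdk_tgt_pid=$!
    waitforlisten $spdk_tgt_pid
    # locks_exist: the running target must hold an spdk_cpu_lock file lock
    lslocks -p $spdk_tgt_pid | grep -q spdk_cpu_lock
    killprocess $spdk_tgt_pid               # kill -0 check, then SIGTERM and wait
    NOT waitforlisten $spdk_tgt_pid         # must fail now: the process is gone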
00:06:47.442 [2024-04-26 20:02:31.673037] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1610914 ] 00:06:47.442 EAL: No free 2048 kB hugepages reported on node 1 00:06:47.442 [2024-04-26 20:02:31.757087] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.442 [2024-04-26 20:02:31.840178] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.379 20:02:32 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:48.379 20:02:32 -- common/autotest_common.sh@860 -- # return 0 00:06:48.379 20:02:32 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:48.379 20:02:32 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:48.379 20:02:32 -- common/autotest_common.sh@10 -- # set +x 00:06:48.379 20:02:32 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:48.379 20:02:32 -- event/cpu_locks.sh@67 -- # no_locks 00:06:48.379 20:02:32 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:48.379 20:02:32 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:48.379 20:02:32 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:48.379 20:02:32 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:48.379 20:02:32 -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:48.379 20:02:32 -- common/autotest_common.sh@10 -- # set +x 00:06:48.379 20:02:32 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:48.379 20:02:32 -- event/cpu_locks.sh@71 -- # locks_exist 1610914 00:06:48.379 20:02:32 -- event/cpu_locks.sh@22 -- # lslocks -p 1610914 00:06:48.379 20:02:32 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:48.379 20:02:32 -- event/cpu_locks.sh@73 -- # killprocess 1610914 00:06:48.379 20:02:32 -- common/autotest_common.sh@946 -- # '[' -z 1610914 ']' 00:06:48.379 20:02:32 -- common/autotest_common.sh@950 -- # kill -0 1610914 00:06:48.379 20:02:32 -- common/autotest_common.sh@951 -- # uname 00:06:48.379 20:02:32 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:48.379 20:02:32 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1610914 00:06:48.638 20:02:32 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:48.638 20:02:32 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:48.638 20:02:32 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1610914' 00:06:48.638 killing process with pid 1610914 00:06:48.638 20:02:32 -- common/autotest_common.sh@965 -- # kill 1610914 00:06:48.638 20:02:32 -- common/autotest_common.sh@970 -- # wait 1610914 00:06:48.897 00:06:48.897 real 0m1.547s 00:06:48.897 user 0m1.560s 00:06:48.897 sys 0m0.560s 00:06:48.897 20:02:33 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:48.897 20:02:33 -- common/autotest_common.sh@10 -- # set +x 00:06:48.897 ************************************ 00:06:48.897 END TEST default_locks_via_rpc 00:06:48.897 ************************************ 00:06:48.897 20:02:33 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:48.897 20:02:33 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:48.897 20:02:33 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:48.897 20:02:33 -- common/autotest_common.sh@10 -- # set +x 00:06:49.154 ************************************ 00:06:49.154 START TEST non_locking_app_on_locked_coremask 
00:06:49.154 ************************************ 00:06:49.154 20:02:33 -- common/autotest_common.sh@1121 -- # non_locking_app_on_locked_coremask 00:06:49.154 20:02:33 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=1611129 00:06:49.154 20:02:33 -- event/cpu_locks.sh@81 -- # waitforlisten 1611129 /var/tmp/spdk.sock 00:06:49.154 20:02:33 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:49.154 20:02:33 -- common/autotest_common.sh@827 -- # '[' -z 1611129 ']' 00:06:49.154 20:02:33 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:49.154 20:02:33 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:49.154 20:02:33 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:49.154 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:49.154 20:02:33 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:49.154 20:02:33 -- common/autotest_common.sh@10 -- # set +x 00:06:49.154 [2024-04-26 20:02:33.429615] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:49.154 [2024-04-26 20:02:33.429700] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1611129 ] 00:06:49.154 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.154 [2024-04-26 20:02:33.515243] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.413 [2024-04-26 20:02:33.606798] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.978 20:02:34 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:49.978 20:02:34 -- common/autotest_common.sh@860 -- # return 0 00:06:49.978 20:02:34 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=1611301 00:06:49.978 20:02:34 -- event/cpu_locks.sh@85 -- # waitforlisten 1611301 /var/tmp/spdk2.sock 00:06:49.978 20:02:34 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:49.978 20:02:34 -- common/autotest_common.sh@827 -- # '[' -z 1611301 ']' 00:06:49.978 20:02:34 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:49.978 20:02:34 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:49.978 20:02:34 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:49.978 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:49.978 20:02:34 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:49.978 20:02:34 -- common/autotest_common.sh@10 -- # set +x 00:06:49.978 [2024-04-26 20:02:34.284352] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:49.978 [2024-04-26 20:02:34.284437] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1611301 ] 00:06:49.978 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.978 [2024-04-26 20:02:34.394806] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:49.978 [2024-04-26 20:02:34.394828] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.237 [2024-04-26 20:02:34.556771] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.805 20:02:35 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:50.805 20:02:35 -- common/autotest_common.sh@860 -- # return 0 00:06:50.805 20:02:35 -- event/cpu_locks.sh@87 -- # locks_exist 1611129 00:06:50.805 20:02:35 -- event/cpu_locks.sh@22 -- # lslocks -p 1611129 00:06:50.805 20:02:35 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:51.740 lslocks: write error 00:06:51.740 20:02:36 -- event/cpu_locks.sh@89 -- # killprocess 1611129 00:06:51.740 20:02:36 -- common/autotest_common.sh@946 -- # '[' -z 1611129 ']' 00:06:51.740 20:02:36 -- common/autotest_common.sh@950 -- # kill -0 1611129 00:06:51.740 20:02:36 -- common/autotest_common.sh@951 -- # uname 00:06:51.740 20:02:36 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:51.740 20:02:36 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1611129 00:06:51.999 20:02:36 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:51.999 20:02:36 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:51.999 20:02:36 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1611129' 00:06:51.999 killing process with pid 1611129 00:06:51.999 20:02:36 -- common/autotest_common.sh@965 -- # kill 1611129 00:06:51.999 20:02:36 -- common/autotest_common.sh@970 -- # wait 1611129 00:06:52.565 20:02:36 -- event/cpu_locks.sh@90 -- # killprocess 1611301 00:06:52.565 20:02:36 -- common/autotest_common.sh@946 -- # '[' -z 1611301 ']' 00:06:52.565 20:02:36 -- common/autotest_common.sh@950 -- # kill -0 1611301 00:06:52.565 20:02:36 -- common/autotest_common.sh@951 -- # uname 00:06:52.565 20:02:36 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:52.565 20:02:36 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1611301 00:06:52.565 20:02:36 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:52.565 20:02:36 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:52.565 20:02:36 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1611301' 00:06:52.565 killing process with pid 1611301 00:06:52.565 20:02:36 -- common/autotest_common.sh@965 -- # kill 1611301 00:06:52.565 20:02:36 -- common/autotest_common.sh@970 -- # wait 1611301 00:06:53.130 00:06:53.130 real 0m3.873s 00:06:53.130 user 0m4.046s 00:06:53.130 sys 0m1.315s 00:06:53.130 20:02:37 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:53.130 20:02:37 -- common/autotest_common.sh@10 -- # set +x 00:06:53.130 ************************************ 00:06:53.130 END TEST non_locking_app_on_locked_coremask 00:06:53.130 ************************************ 00:06:53.130 20:02:37 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:53.130 20:02:37 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:53.130 20:02:37 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:53.130 20:02:37 -- common/autotest_common.sh@10 -- # set +x 00:06:53.130 ************************************ 00:06:53.130 START TEST locking_app_on_unlocked_coremask 00:06:53.130 ************************************ 00:06:53.130 20:02:37 -- common/autotest_common.sh@1121 -- # locking_app_on_unlocked_coremask 00:06:53.130 20:02:37 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=1611707 00:06:53.130 20:02:37 -- 
event/cpu_locks.sh@99 -- # waitforlisten 1611707 /var/tmp/spdk.sock 00:06:53.130 20:02:37 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:53.130 20:02:37 -- common/autotest_common.sh@827 -- # '[' -z 1611707 ']' 00:06:53.130 20:02:37 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:53.130 20:02:37 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:53.130 20:02:37 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:53.130 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:53.130 20:02:37 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:53.130 20:02:37 -- common/autotest_common.sh@10 -- # set +x 00:06:53.130 [2024-04-26 20:02:37.513654] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:53.130 [2024-04-26 20:02:37.513734] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1611707 ] 00:06:53.130 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.388 [2024-04-26 20:02:37.594764] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:53.388 [2024-04-26 20:02:37.594786] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.388 [2024-04-26 20:02:37.675496] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.954 20:02:38 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:53.954 20:02:38 -- common/autotest_common.sh@860 -- # return 0 00:06:53.954 20:02:38 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:53.954 20:02:38 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=1611872 00:06:53.954 20:02:38 -- event/cpu_locks.sh@103 -- # waitforlisten 1611872 /var/tmp/spdk2.sock 00:06:53.954 20:02:38 -- common/autotest_common.sh@827 -- # '[' -z 1611872 ']' 00:06:53.954 20:02:38 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:53.954 20:02:38 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:53.954 20:02:38 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:53.954 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:53.954 20:02:38 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:53.955 20:02:38 -- common/autotest_common.sh@10 -- # set +x 00:06:53.955 [2024-04-26 20:02:38.370638] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:06:53.955 [2024-04-26 20:02:38.370735] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1611872 ] 00:06:54.230 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.230 [2024-04-26 20:02:38.483753] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.230 [2024-04-26 20:02:38.656482] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.876 20:02:39 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:54.876 20:02:39 -- common/autotest_common.sh@860 -- # return 0 00:06:54.876 20:02:39 -- event/cpu_locks.sh@105 -- # locks_exist 1611872 00:06:54.876 20:02:39 -- event/cpu_locks.sh@22 -- # lslocks -p 1611872 00:06:54.876 20:02:39 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:56.253 lslocks: write error 00:06:56.253 20:02:40 -- event/cpu_locks.sh@107 -- # killprocess 1611707 00:06:56.253 20:02:40 -- common/autotest_common.sh@946 -- # '[' -z 1611707 ']' 00:06:56.253 20:02:40 -- common/autotest_common.sh@950 -- # kill -0 1611707 00:06:56.253 20:02:40 -- common/autotest_common.sh@951 -- # uname 00:06:56.253 20:02:40 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:56.253 20:02:40 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1611707 00:06:56.253 20:02:40 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:56.253 20:02:40 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:56.253 20:02:40 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1611707' 00:06:56.253 killing process with pid 1611707 00:06:56.253 20:02:40 -- common/autotest_common.sh@965 -- # kill 1611707 00:06:56.253 20:02:40 -- common/autotest_common.sh@970 -- # wait 1611707 00:06:56.821 20:02:41 -- event/cpu_locks.sh@108 -- # killprocess 1611872 00:06:56.821 20:02:41 -- common/autotest_common.sh@946 -- # '[' -z 1611872 ']' 00:06:56.821 20:02:41 -- common/autotest_common.sh@950 -- # kill -0 1611872 00:06:56.821 20:02:41 -- common/autotest_common.sh@951 -- # uname 00:06:56.821 20:02:41 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:56.821 20:02:41 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1611872 00:06:56.821 20:02:41 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:56.821 20:02:41 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:56.821 20:02:41 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1611872' 00:06:56.821 killing process with pid 1611872 00:06:56.821 20:02:41 -- common/autotest_common.sh@965 -- # kill 1611872 00:06:56.821 20:02:41 -- common/autotest_common.sh@970 -- # wait 1611872 00:06:57.080 00:06:57.080 real 0m3.996s 00:06:57.080 user 0m4.187s 00:06:57.080 sys 0m1.348s 00:06:57.080 20:02:41 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:57.080 20:02:41 -- common/autotest_common.sh@10 -- # set +x 00:06:57.080 ************************************ 00:06:57.080 END TEST locking_app_on_unlocked_coremask 00:06:57.080 ************************************ 00:06:57.338 20:02:41 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:57.338 20:02:41 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:57.338 20:02:41 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:57.338 20:02:41 -- common/autotest_common.sh@10 -- # set +x 00:06:57.338 
************************************ 00:06:57.338 START TEST locking_app_on_locked_coremask 00:06:57.338 ************************************ 00:06:57.338 20:02:41 -- common/autotest_common.sh@1121 -- # locking_app_on_locked_coremask 00:06:57.338 20:02:41 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=1612281 00:06:57.338 20:02:41 -- event/cpu_locks.sh@116 -- # waitforlisten 1612281 /var/tmp/spdk.sock 00:06:57.338 20:02:41 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:57.338 20:02:41 -- common/autotest_common.sh@827 -- # '[' -z 1612281 ']' 00:06:57.339 20:02:41 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:57.339 20:02:41 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:57.339 20:02:41 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:57.339 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:57.339 20:02:41 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:57.339 20:02:41 -- common/autotest_common.sh@10 -- # set +x 00:06:57.339 [2024-04-26 20:02:41.721091] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:57.339 [2024-04-26 20:02:41.721165] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612281 ] 00:06:57.339 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.597 [2024-04-26 20:02:41.806456] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.597 [2024-04-26 20:02:41.897317] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.166 20:02:42 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:58.166 20:02:42 -- common/autotest_common.sh@860 -- # return 0 00:06:58.166 20:02:42 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:58.166 20:02:42 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=1612457 00:06:58.166 20:02:42 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 1612457 /var/tmp/spdk2.sock 00:06:58.166 20:02:42 -- common/autotest_common.sh@648 -- # local es=0 00:06:58.166 20:02:42 -- common/autotest_common.sh@650 -- # valid_exec_arg waitforlisten 1612457 /var/tmp/spdk2.sock 00:06:58.166 20:02:42 -- common/autotest_common.sh@636 -- # local arg=waitforlisten 00:06:58.166 20:02:42 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:58.166 20:02:42 -- common/autotest_common.sh@640 -- # type -t waitforlisten 00:06:58.166 20:02:42 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:06:58.166 20:02:42 -- common/autotest_common.sh@651 -- # waitforlisten 1612457 /var/tmp/spdk2.sock 00:06:58.166 20:02:42 -- common/autotest_common.sh@827 -- # '[' -z 1612457 ']' 00:06:58.166 20:02:42 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:58.166 20:02:42 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:58.166 20:02:42 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:58.166 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:06:58.166 20:02:42 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:58.166 20:02:42 -- common/autotest_common.sh@10 -- # set +x 00:06:58.166 [2024-04-26 20:02:42.561012] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:58.166 [2024-04-26 20:02:42.561065] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612457 ] 00:06:58.166 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.425 [2024-04-26 20:02:42.670198] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 1612281 has claimed it. 00:06:58.425 [2024-04-26 20:02:42.670229] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:58.993 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 842: kill: (1612457) - No such process 00:06:58.993 ERROR: process (pid: 1612457) is no longer running 00:06:58.993 20:02:43 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:06:58.993 20:02:43 -- common/autotest_common.sh@860 -- # return 1 00:06:58.993 20:02:43 -- common/autotest_common.sh@651 -- # es=1 00:06:58.993 20:02:43 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:06:58.993 20:02:43 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:06:58.993 20:02:43 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:06:58.993 20:02:43 -- event/cpu_locks.sh@122 -- # locks_exist 1612281 00:06:58.993 20:02:43 -- event/cpu_locks.sh@22 -- # lslocks -p 1612281 00:06:58.993 20:02:43 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:59.252 lslocks: write error 00:06:59.252 20:02:43 -- event/cpu_locks.sh@124 -- # killprocess 1612281 00:06:59.252 20:02:43 -- common/autotest_common.sh@946 -- # '[' -z 1612281 ']' 00:06:59.252 20:02:43 -- common/autotest_common.sh@950 -- # kill -0 1612281 00:06:59.252 20:02:43 -- common/autotest_common.sh@951 -- # uname 00:06:59.252 20:02:43 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:06:59.252 20:02:43 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1612281 00:06:59.252 20:02:43 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:06:59.252 20:02:43 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:06:59.252 20:02:43 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1612281' 00:06:59.252 killing process with pid 1612281 00:06:59.252 20:02:43 -- common/autotest_common.sh@965 -- # kill 1612281 00:06:59.252 20:02:43 -- common/autotest_common.sh@970 -- # wait 1612281 00:06:59.821 00:06:59.821 real 0m2.284s 00:06:59.821 user 0m2.408s 00:06:59.821 sys 0m0.727s 00:06:59.821 20:02:43 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:06:59.821 20:02:43 -- common/autotest_common.sh@10 -- # set +x 00:06:59.821 ************************************ 00:06:59.821 END TEST locking_app_on_locked_coremask 00:06:59.821 ************************************ 00:06:59.821 20:02:44 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:59.821 20:02:44 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:06:59.821 20:02:44 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:06:59.821 20:02:44 -- common/autotest_common.sh@10 -- # set +x 00:06:59.821 ************************************ 00:06:59.821 START TEST locking_overlapped_coremask 00:06:59.821 
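The locking_app_on_locked_coremask failure above ("Cannot create lock on core 0, probably process 1612281 has claimed it") is the intended outcome when a second target is started on an already-claimed core without --disable-cpumask-locks. A rough way to reproduce it outside the harness, using only the flags and sockets visible in the log ($SPDK_DIR is an assumed checkout path):

  bin=$SPDK_DIR/build/bin/spdk_tgt

  $bin -m 0x1 &                          # first instance claims core 0 (default /var/tmp/spdk.sock)
  first=$!
  sleep 2                                # stand-in for the test's waitforlisten

  # second instance on the same core, separate RPC socket: expected to abort with
  # "Cannot create lock on core 0 ..." / "Unable to acquire lock on assigned core mask - exiting."
  $bin -m 0x1 -r /var/tmp/spdk2.sock
  echo "second instance exit code: $?"   # non-zero is the passing outcome for this test

  # adding --disable-cpumask-locks to the second invocation avoids the claim, which is what
  # the non_locking_app_on_locked_coremask run above exercises
  kill "$first"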
************************************ 00:06:59.821 20:02:44 -- common/autotest_common.sh@1121 -- # locking_overlapped_coremask 00:06:59.821 20:02:44 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=1612668 00:06:59.821 20:02:44 -- event/cpu_locks.sh@133 -- # waitforlisten 1612668 /var/tmp/spdk.sock 00:06:59.821 20:02:44 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:59.821 20:02:44 -- common/autotest_common.sh@827 -- # '[' -z 1612668 ']' 00:06:59.821 20:02:44 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:59.821 20:02:44 -- common/autotest_common.sh@832 -- # local max_retries=100 00:06:59.821 20:02:44 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:59.821 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:59.821 20:02:44 -- common/autotest_common.sh@836 -- # xtrace_disable 00:06:59.821 20:02:44 -- common/autotest_common.sh@10 -- # set +x 00:06:59.821 [2024-04-26 20:02:44.211288] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:06:59.821 [2024-04-26 20:02:44.211350] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612668 ] 00:06:59.821 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.080 [2024-04-26 20:02:44.294630] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:00.080 [2024-04-26 20:02:44.379288] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:00.080 [2024-04-26 20:02:44.379375] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:00.080 [2024-04-26 20:02:44.379377] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.648 20:02:45 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:00.648 20:02:45 -- common/autotest_common.sh@860 -- # return 0 00:07:00.648 20:02:45 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=1612846 00:07:00.648 20:02:45 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 1612846 /var/tmp/spdk2.sock 00:07:00.648 20:02:45 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:07:00.648 20:02:45 -- common/autotest_common.sh@648 -- # local es=0 00:07:00.648 20:02:45 -- common/autotest_common.sh@650 -- # valid_exec_arg waitforlisten 1612846 /var/tmp/spdk2.sock 00:07:00.648 20:02:45 -- common/autotest_common.sh@636 -- # local arg=waitforlisten 00:07:00.648 20:02:45 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:00.648 20:02:45 -- common/autotest_common.sh@640 -- # type -t waitforlisten 00:07:00.648 20:02:45 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:00.648 20:02:45 -- common/autotest_common.sh@651 -- # waitforlisten 1612846 /var/tmp/spdk2.sock 00:07:00.648 20:02:45 -- common/autotest_common.sh@827 -- # '[' -z 1612846 ']' 00:07:00.648 20:02:45 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:00.648 20:02:45 -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:00.648 20:02:45 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
00:07:00.648 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:07:00.648 20:02:45 -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:00.648 20:02:45 -- common/autotest_common.sh@10 -- # set +x 00:07:00.648 [2024-04-26 20:02:45.060450] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:00.648 [2024-04-26 20:02:45.060523] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1612846 ] 00:07:00.906 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.906 [2024-04-26 20:02:45.173338] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1612668 has claimed it. 00:07:00.906 [2024-04-26 20:02:45.173375] app.c: 821:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:07:01.473 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 842: kill: (1612846) - No such process 00:07:01.473 ERROR: process (pid: 1612846) is no longer running 00:07:01.473 20:02:45 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:01.473 20:02:45 -- common/autotest_common.sh@860 -- # return 1 00:07:01.473 20:02:45 -- common/autotest_common.sh@651 -- # es=1 00:07:01.473 20:02:45 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:01.473 20:02:45 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:01.473 20:02:45 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:01.473 20:02:45 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:07:01.473 20:02:45 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:01.473 20:02:45 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:01.473 20:02:45 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:01.473 20:02:45 -- event/cpu_locks.sh@141 -- # killprocess 1612668 00:07:01.473 20:02:45 -- common/autotest_common.sh@946 -- # '[' -z 1612668 ']' 00:07:01.473 20:02:45 -- common/autotest_common.sh@950 -- # kill -0 1612668 00:07:01.473 20:02:45 -- common/autotest_common.sh@951 -- # uname 00:07:01.473 20:02:45 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:01.473 20:02:45 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1612668 00:07:01.473 20:02:45 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:01.473 20:02:45 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:01.473 20:02:45 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1612668' 00:07:01.473 killing process with pid 1612668 00:07:01.474 20:02:45 -- common/autotest_common.sh@965 -- # kill 1612668 00:07:01.474 20:02:45 -- common/autotest_common.sh@970 -- # wait 1612668 00:07:01.732 00:07:01.732 real 0m1.896s 00:07:01.733 user 0m5.241s 00:07:01.733 sys 0m0.492s 00:07:01.733 20:02:46 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:01.733 20:02:46 -- common/autotest_common.sh@10 -- # set +x 00:07:01.733 ************************************ 00:07:01.733 END TEST locking_overlapped_coremask 00:07:01.733 ************************************ 00:07:01.733 20:02:46 -- event/cpu_locks.sh@172 -- # 
run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:07:01.733 20:02:46 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:01.733 20:02:46 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:01.733 20:02:46 -- common/autotest_common.sh@10 -- # set +x 00:07:01.992 ************************************ 00:07:01.992 START TEST locking_overlapped_coremask_via_rpc 00:07:01.992 ************************************ 00:07:01.992 20:02:46 -- common/autotest_common.sh@1121 -- # locking_overlapped_coremask_via_rpc 00:07:01.992 20:02:46 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=1613056 00:07:01.992 20:02:46 -- event/cpu_locks.sh@149 -- # waitforlisten 1613056 /var/tmp/spdk.sock 00:07:01.992 20:02:46 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:07:01.992 20:02:46 -- common/autotest_common.sh@827 -- # '[' -z 1613056 ']' 00:07:01.992 20:02:46 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:01.992 20:02:46 -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:01.992 20:02:46 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:01.992 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:01.992 20:02:46 -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:01.992 20:02:46 -- common/autotest_common.sh@10 -- # set +x 00:07:01.992 [2024-04-26 20:02:46.302431] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:01.992 [2024-04-26 20:02:46.302503] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613056 ] 00:07:01.992 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.992 [2024-04-26 20:02:46.386515] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:01.992 [2024-04-26 20:02:46.386545] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:02.251 [2024-04-26 20:02:46.474970] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:02.251 [2024-04-26 20:02:46.475055] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:02.251 [2024-04-26 20:02:46.475057] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.818 20:02:47 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:02.818 20:02:47 -- common/autotest_common.sh@860 -- # return 0 00:07:02.819 20:02:47 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:07:02.819 20:02:47 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=1613076 00:07:02.819 20:02:47 -- event/cpu_locks.sh@153 -- # waitforlisten 1613076 /var/tmp/spdk2.sock 00:07:02.819 20:02:47 -- common/autotest_common.sh@827 -- # '[' -z 1613076 ']' 00:07:02.819 20:02:47 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:02.819 20:02:47 -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:02.819 20:02:47 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:02.819 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
00:07:02.819 20:02:47 -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:02.819 20:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:02.819 [2024-04-26 20:02:47.140506] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:02.819 [2024-04-26 20:02:47.140570] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613076 ] 00:07:02.819 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.819 [2024-04-26 20:02:47.250274] app.c: 825:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:07:02.819 [2024-04-26 20:02:47.250304] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:03.077 [2024-04-26 20:02:47.417683] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:03.077 [2024-04-26 20:02:47.417792] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:03.077 [2024-04-26 20:02:47.417793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:07:03.643 20:02:47 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:03.643 20:02:47 -- common/autotest_common.sh@860 -- # return 0 00:07:03.643 20:02:47 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:07:03.643 20:02:47 -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:03.643 20:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:03.643 20:02:47 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:03.643 20:02:47 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:03.643 20:02:47 -- common/autotest_common.sh@648 -- # local es=0 00:07:03.643 20:02:47 -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:03.643 20:02:47 -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:07:03.643 20:02:47 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:03.643 20:02:47 -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:07:03.643 20:02:47 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:03.643 20:02:47 -- common/autotest_common.sh@651 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:07:03.643 20:02:47 -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:03.643 20:02:47 -- common/autotest_common.sh@10 -- # set +x 00:07:03.643 [2024-04-26 20:02:48.009947] app.c: 691:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 1613056 has claimed it. 
00:07:03.643 request: 00:07:03.643 { 00:07:03.643 "method": "framework_enable_cpumask_locks", 00:07:03.643 "req_id": 1 00:07:03.643 } 00:07:03.643 Got JSON-RPC error response 00:07:03.643 response: 00:07:03.643 { 00:07:03.643 "code": -32603, 00:07:03.643 "message": "Failed to claim CPU core: 2" 00:07:03.643 } 00:07:03.643 20:02:48 -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:03.643 20:02:48 -- common/autotest_common.sh@651 -- # es=1 00:07:03.643 20:02:48 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:03.643 20:02:48 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:03.643 20:02:48 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:03.643 20:02:48 -- event/cpu_locks.sh@158 -- # waitforlisten 1613056 /var/tmp/spdk.sock 00:07:03.643 20:02:48 -- common/autotest_common.sh@827 -- # '[' -z 1613056 ']' 00:07:03.643 20:02:48 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:03.643 20:02:48 -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:03.644 20:02:48 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:03.644 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:03.644 20:02:48 -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:03.644 20:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:03.902 20:02:48 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:03.902 20:02:48 -- common/autotest_common.sh@860 -- # return 0 00:07:03.902 20:02:48 -- event/cpu_locks.sh@159 -- # waitforlisten 1613076 /var/tmp/spdk2.sock 00:07:03.902 20:02:48 -- common/autotest_common.sh@827 -- # '[' -z 1613076 ']' 00:07:03.902 20:02:48 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk2.sock 00:07:03.902 20:02:48 -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:03.902 20:02:48 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:07:03.902 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
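The JSON-RPC exchange above is the runtime side of the same core-claim check: framework_enable_cpumask_locks returns -32603 ("Failed to claim CPU core: 2") when another process already holds a core in the caller's mask. A hedged sketch of issuing the two calls by hand; scripts/rpc.py is assumed to behave like the test's rpc_cmd wrapper, and the sockets match the ones in the log:

  rpc=$SPDK_DIR/scripts/rpc.py

  # both targets start with --disable-cpumask-locks (-m 0x7 on spdk.sock, -m 0x1c on spdk2.sock);
  # core 2 sits in both masks, so only the first enable call can claim it
  $rpc -s /var/tmp/spdk.sock  framework_enable_cpumask_locks    # first caller wins the claim
  $rpc -s /var/tmp/spdk2.sock framework_enable_cpumask_locks    # expected JSON-RPC error, as logged:
  #   "code": -32603, "message": "Failed to claim CPU core: 2"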
00:07:03.902 20:02:48 -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:03.902 20:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:04.160 20:02:48 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:04.160 20:02:48 -- common/autotest_common.sh@860 -- # return 0 00:07:04.160 20:02:48 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:07:04.160 20:02:48 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:07:04.160 20:02:48 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:07:04.160 20:02:48 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:07:04.160 00:07:04.160 real 0m2.090s 00:07:04.160 user 0m0.805s 00:07:04.160 sys 0m0.210s 00:07:04.160 20:02:48 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:04.160 20:02:48 -- common/autotest_common.sh@10 -- # set +x 00:07:04.160 ************************************ 00:07:04.160 END TEST locking_overlapped_coremask_via_rpc 00:07:04.160 ************************************ 00:07:04.160 20:02:48 -- event/cpu_locks.sh@174 -- # cleanup 00:07:04.160 20:02:48 -- event/cpu_locks.sh@15 -- # [[ -z 1613056 ]] 00:07:04.160 20:02:48 -- event/cpu_locks.sh@15 -- # killprocess 1613056 00:07:04.160 20:02:48 -- common/autotest_common.sh@946 -- # '[' -z 1613056 ']' 00:07:04.160 20:02:48 -- common/autotest_common.sh@950 -- # kill -0 1613056 00:07:04.160 20:02:48 -- common/autotest_common.sh@951 -- # uname 00:07:04.160 20:02:48 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:04.160 20:02:48 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1613056 00:07:04.160 20:02:48 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:04.160 20:02:48 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:04.160 20:02:48 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1613056' 00:07:04.160 killing process with pid 1613056 00:07:04.160 20:02:48 -- common/autotest_common.sh@965 -- # kill 1613056 00:07:04.160 20:02:48 -- common/autotest_common.sh@970 -- # wait 1613056 00:07:04.418 20:02:48 -- event/cpu_locks.sh@16 -- # [[ -z 1613076 ]] 00:07:04.418 20:02:48 -- event/cpu_locks.sh@16 -- # killprocess 1613076 00:07:04.418 20:02:48 -- common/autotest_common.sh@946 -- # '[' -z 1613076 ']' 00:07:04.418 20:02:48 -- common/autotest_common.sh@950 -- # kill -0 1613076 00:07:04.418 20:02:48 -- common/autotest_common.sh@951 -- # uname 00:07:04.418 20:02:48 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:04.418 20:02:48 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1613076 00:07:04.418 20:02:48 -- common/autotest_common.sh@952 -- # process_name=reactor_2 00:07:04.418 20:02:48 -- common/autotest_common.sh@956 -- # '[' reactor_2 = sudo ']' 00:07:04.418 20:02:48 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1613076' 00:07:04.418 killing process with pid 1613076 00:07:04.418 20:02:48 -- common/autotest_common.sh@965 -- # kill 1613076 00:07:04.418 20:02:48 -- common/autotest_common.sh@970 -- # wait 1613076 00:07:04.985 20:02:49 -- event/cpu_locks.sh@18 -- # rm -f 00:07:04.985 20:02:49 -- event/cpu_locks.sh@1 -- # cleanup 00:07:04.985 20:02:49 -- event/cpu_locks.sh@15 -- # [[ -z 1613056 ]] 00:07:04.985 20:02:49 -- event/cpu_locks.sh@15 -- # killprocess 1613056 
00:07:04.985 20:02:49 -- common/autotest_common.sh@946 -- # '[' -z 1613056 ']' 00:07:04.985 20:02:49 -- common/autotest_common.sh@950 -- # kill -0 1613056 00:07:04.985 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 950: kill: (1613056) - No such process 00:07:04.985 20:02:49 -- common/autotest_common.sh@973 -- # echo 'Process with pid 1613056 is not found' 00:07:04.985 Process with pid 1613056 is not found 00:07:04.985 20:02:49 -- event/cpu_locks.sh@16 -- # [[ -z 1613076 ]] 00:07:04.985 20:02:49 -- event/cpu_locks.sh@16 -- # killprocess 1613076 00:07:04.985 20:02:49 -- common/autotest_common.sh@946 -- # '[' -z 1613076 ']' 00:07:04.985 20:02:49 -- common/autotest_common.sh@950 -- # kill -0 1613076 00:07:04.985 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 950: kill: (1613076) - No such process 00:07:04.985 20:02:49 -- common/autotest_common.sh@973 -- # echo 'Process with pid 1613076 is not found' 00:07:04.985 Process with pid 1613076 is not found 00:07:04.985 20:02:49 -- event/cpu_locks.sh@18 -- # rm -f 00:07:04.985 00:07:04.985 real 0m20.001s 00:07:04.985 user 0m31.212s 00:07:04.985 sys 0m6.921s 00:07:04.985 20:02:49 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:04.985 20:02:49 -- common/autotest_common.sh@10 -- # set +x 00:07:04.985 ************************************ 00:07:04.985 END TEST cpu_locks 00:07:04.985 ************************************ 00:07:04.985 00:07:04.985 real 0m46.903s 00:07:04.985 user 1m21.474s 00:07:04.985 sys 0m11.797s 00:07:04.985 20:02:49 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:04.985 20:02:49 -- common/autotest_common.sh@10 -- # set +x 00:07:04.985 ************************************ 00:07:04.985 END TEST event 00:07:04.985 ************************************ 00:07:04.985 20:02:49 -- spdk/autotest.sh@178 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:04.985 20:02:49 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:04.985 20:02:49 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:04.985 20:02:49 -- common/autotest_common.sh@10 -- # set +x 00:07:05.243 ************************************ 00:07:05.243 START TEST thread 00:07:05.243 ************************************ 00:07:05.243 20:02:49 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:07:05.243 * Looking for test storage... 00:07:05.243 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:07:05.243 20:02:49 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:05.243 20:02:49 -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:07:05.243 20:02:49 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:05.243 20:02:49 -- common/autotest_common.sh@10 -- # set +x 00:07:05.243 ************************************ 00:07:05.243 START TEST thread_poller_perf 00:07:05.243 ************************************ 00:07:05.243 20:02:49 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:07:05.501 [2024-04-26 20:02:49.697946] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:07:05.501 [2024-04-26 20:02:49.698027] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613540 ] 00:07:05.501 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.501 [2024-04-26 20:02:49.781270] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.501 [2024-04-26 20:02:49.861707] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.501 Running 1000 pollers for 1 seconds with 1 microseconds period. 00:07:06.874 ====================================== 00:07:06.874 busy:2304836894 (cyc) 00:07:06.874 total_run_count: 819000 00:07:06.874 tsc_hz: 2300000000 (cyc) 00:07:06.874 ====================================== 00:07:06.874 poller_cost: 2814 (cyc), 1223 (nsec) 00:07:06.874 00:07:06.874 real 0m1.256s 00:07:06.874 user 0m1.154s 00:07:06.874 sys 0m0.096s 00:07:06.874 20:02:50 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:06.874 20:02:50 -- common/autotest_common.sh@10 -- # set +x 00:07:06.874 ************************************ 00:07:06.874 END TEST thread_poller_perf 00:07:06.874 ************************************ 00:07:06.874 20:02:50 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:06.874 20:02:50 -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:07:06.874 20:02:50 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:06.874 20:02:50 -- common/autotest_common.sh@10 -- # set +x 00:07:06.874 ************************************ 00:07:06.874 START TEST thread_poller_perf 00:07:06.874 ************************************ 00:07:06.874 20:02:51 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:07:06.874 [2024-04-26 20:02:51.115710] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:06.874 [2024-04-26 20:02:51.115794] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613742 ] 00:07:06.874 EAL: No free 2048 kB hugepages reported on node 1 00:07:06.874 [2024-04-26 20:02:51.198481] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.874 [2024-04-26 20:02:51.279345] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.874 Running 1000 pollers for 1 seconds with 0 microseconds period. 
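The ====== summary above is straightforward arithmetic over the run: poller_cost in cycles is busy cycles divided by total_run_count, and the nanosecond figure rescales that by tsc_hz. Rechecking the 1-microsecond-period run with the numbers as printed (the same formula applies to the 0-microsecond run reported next):

  awk 'BEGIN {
         busy = 2304836894; runs = 819000; hz = 2300000000
         cyc  = busy / runs                      # ~2814 cycles per poller invocation
         printf "poller_cost: %d (cyc), %d (nsec)\n", cyc, cyc * 1e9 / hz
       }'
  # prints: poller_cost: 2814 (cyc), 1223 (nsec)  -- matching the summary above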
00:07:08.249 ====================================== 00:07:08.249 busy:2301248196 (cyc) 00:07:08.249 total_run_count: 13560000 00:07:08.249 tsc_hz: 2300000000 (cyc) 00:07:08.249 ====================================== 00:07:08.249 poller_cost: 169 (cyc), 73 (nsec) 00:07:08.249 00:07:08.249 real 0m1.254s 00:07:08.249 user 0m1.150s 00:07:08.249 sys 0m0.099s 00:07:08.249 20:02:52 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:08.249 20:02:52 -- common/autotest_common.sh@10 -- # set +x 00:07:08.249 ************************************ 00:07:08.249 END TEST thread_poller_perf 00:07:08.249 ************************************ 00:07:08.249 20:02:52 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:07:08.249 20:02:52 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:08.249 20:02:52 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:08.249 20:02:52 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:08.249 20:02:52 -- common/autotest_common.sh@10 -- # set +x 00:07:08.249 ************************************ 00:07:08.249 START TEST thread_spdk_lock 00:07:08.249 ************************************ 00:07:08.249 20:02:52 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:07:08.249 [2024-04-26 20:02:52.557710] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:08.249 [2024-04-26 20:02:52.557795] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1613943 ] 00:07:08.249 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.249 [2024-04-26 20:02:52.640923] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:08.507 [2024-04-26 20:02:52.724117] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:08.507 [2024-04-26 20:02:52.724120] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.079 [2024-04-26 20:02:53.217785] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 955:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:09.079 [2024-04-26 20:02:53.217820] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3062:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:07:09.079 [2024-04-26 20:02:53.217831] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3017:sspin_stacks_print: *ERROR*: spinlock 0x14b1a00 00:07:09.079 [2024-04-26 20:02:53.218757] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:09.079 [2024-04-26 20:02:53.218862] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1016:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:09.079 [2024-04-26 20:02:53.218885] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 850:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:07:09.079 
Starting test contend 00:07:09.079 Worker Delay Wait us Hold us Total us 00:07:09.079 0 3 167156 189056 356212 00:07:09.079 1 5 85318 288357 373675 00:07:09.079 PASS test contend 00:07:09.079 Starting test hold_by_poller 00:07:09.079 PASS test hold_by_poller 00:07:09.079 Starting test hold_by_message 00:07:09.079 PASS test hold_by_message 00:07:09.079 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:07:09.079 100014 assertions passed 00:07:09.079 0 assertions failed 00:07:09.079 00:07:09.079 real 0m0.746s 00:07:09.079 user 0m1.134s 00:07:09.079 sys 0m0.102s 00:07:09.080 20:02:53 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:09.080 20:02:53 -- common/autotest_common.sh@10 -- # set +x 00:07:09.080 ************************************ 00:07:09.080 END TEST thread_spdk_lock 00:07:09.080 ************************************ 00:07:09.080 00:07:09.080 real 0m3.861s 00:07:09.080 user 0m3.653s 00:07:09.080 sys 0m0.681s 00:07:09.080 20:02:53 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:09.080 20:02:53 -- common/autotest_common.sh@10 -- # set +x 00:07:09.080 ************************************ 00:07:09.080 END TEST thread 00:07:09.080 ************************************ 00:07:09.080 20:02:53 -- spdk/autotest.sh@179 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:07:09.080 20:02:53 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:09.080 20:02:53 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:09.080 20:02:53 -- common/autotest_common.sh@10 -- # set +x 00:07:09.337 ************************************ 00:07:09.337 START TEST accel 00:07:09.337 ************************************ 00:07:09.337 20:02:53 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:07:09.337 * Looking for test storage... 00:07:09.337 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:09.337 20:02:53 -- accel/accel.sh@81 -- # declare -A expected_opcs 00:07:09.337 20:02:53 -- accel/accel.sh@82 -- # get_expected_opcs 00:07:09.338 20:02:53 -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:09.338 20:02:53 -- accel/accel.sh@62 -- # spdk_tgt_pid=1614182 00:07:09.338 20:02:53 -- accel/accel.sh@63 -- # waitforlisten 1614182 00:07:09.338 20:02:53 -- common/autotest_common.sh@827 -- # '[' -z 1614182 ']' 00:07:09.338 20:02:53 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:09.338 20:02:53 -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:09.338 20:02:53 -- accel/accel.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:09.338 20:02:53 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:09.338 20:02:53 -- accel/accel.sh@61 -- # build_accel_config 00:07:09.338 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
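In the contend summary above, the Total us column is simply Wait us plus Hold us for each worker: 167156 + 189056 = 356212 for worker 0 (delay 3) and 85318 + 288357 = 373675 for worker 1 (delay 5), i.e. the time each worker spent waiting on the shared spdk_spin lock plus the time it held it; the column interpretation is inferred from the fact that both rows add up exactly.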
00:07:09.338 20:02:53 -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:09.338 20:02:53 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:09.338 20:02:53 -- common/autotest_common.sh@10 -- # set +x 00:07:09.338 20:02:53 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:09.338 20:02:53 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.338 20:02:53 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.338 20:02:53 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:09.338 20:02:53 -- accel/accel.sh@40 -- # local IFS=, 00:07:09.338 20:02:53 -- accel/accel.sh@41 -- # jq -r . 00:07:09.338 [2024-04-26 20:02:53.670299] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:09.338 [2024-04-26 20:02:53.670390] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614182 ] 00:07:09.338 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.338 [2024-04-26 20:02:53.756026] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:09.595 [2024-04-26 20:02:53.843988] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:10.161 20:02:54 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:10.161 20:02:54 -- common/autotest_common.sh@860 -- # return 0 00:07:10.161 20:02:54 -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:10.161 20:02:54 -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:10.161 20:02:54 -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:10.161 20:02:54 -- accel/accel.sh@68 -- # [[ -n '' ]] 00:07:10.161 20:02:54 -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:10.161 20:02:54 -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:10.161 20:02:54 -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:07:10.161 20:02:54 -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:10.161 20:02:54 -- common/autotest_common.sh@10 -- # set +x 00:07:10.161 20:02:54 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 
20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # IFS== 00:07:10.161 20:02:54 -- accel/accel.sh@72 -- # read -r opc module 00:07:10.161 20:02:54 -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.161 20:02:54 -- accel/accel.sh@75 -- # killprocess 1614182 00:07:10.161 20:02:54 -- common/autotest_common.sh@946 -- # '[' -z 1614182 ']' 00:07:10.161 20:02:54 -- common/autotest_common.sh@950 -- # kill -0 1614182 00:07:10.161 20:02:54 -- common/autotest_common.sh@951 -- # uname 00:07:10.161 20:02:54 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:10.161 20:02:54 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1614182 00:07:10.161 20:02:54 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:10.161 20:02:54 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:10.161 20:02:54 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1614182' 00:07:10.161 killing process with pid 1614182 00:07:10.161 20:02:54 -- common/autotest_common.sh@965 -- # kill 1614182 00:07:10.161 20:02:54 -- common/autotest_common.sh@970 -- # wait 1614182 00:07:10.728 20:02:54 -- accel/accel.sh@76 -- # trap - ERR 00:07:10.728 20:02:54 -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:07:10.728 20:02:54 -- common/autotest_common.sh@1097 -- # '[' 3 -le 1 ']' 00:07:10.728 20:02:54 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:10.728 20:02:54 -- common/autotest_common.sh@10 -- # set +x 00:07:10.728 20:02:55 -- common/autotest_common.sh@1121 -- # accel_perf -h 00:07:10.728 20:02:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:07:10.728 20:02:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.728 20:02:55 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:10.728 20:02:55 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:10.728 20:02:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.728 20:02:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.728 20:02:55 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:10.728 20:02:55 -- accel/accel.sh@40 -- # local IFS=, 00:07:10.728 20:02:55 -- accel/accel.sh@41 -- # jq -r . 
00:07:10.728 20:02:55 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:10.728 20:02:55 -- common/autotest_common.sh@10 -- # set +x 00:07:10.728 20:02:55 -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:07:10.728 20:02:55 -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:10.728 20:02:55 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:10.728 20:02:55 -- common/autotest_common.sh@10 -- # set +x 00:07:10.986 ************************************ 00:07:10.986 START TEST accel_missing_filename 00:07:10.986 ************************************ 00:07:10.986 20:02:55 -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w compress 00:07:10.986 20:02:55 -- common/autotest_common.sh@648 -- # local es=0 00:07:10.986 20:02:55 -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:07:10.986 20:02:55 -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:10.986 20:02:55 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:10.986 20:02:55 -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:10.986 20:02:55 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:10.986 20:02:55 -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:07:10.986 20:02:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:07:10.986 20:02:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:10.986 20:02:55 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:10.986 20:02:55 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:10.986 20:02:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.986 20:02:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.986 20:02:55 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:10.986 20:02:55 -- accel/accel.sh@40 -- # local IFS=, 00:07:10.986 20:02:55 -- accel/accel.sh@41 -- # jq -r . 00:07:10.986 [2024-04-26 20:02:55.341562] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:10.986 [2024-04-26 20:02:55.341645] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614417 ] 00:07:10.986 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.243 [2024-04-26 20:02:55.430076] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.243 [2024-04-26 20:02:55.513362] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.243 [2024-04-26 20:02:55.559680] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:11.243 [2024-04-26 20:02:55.629041] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:07:11.502 A filename is required. 
00:07:11.502 20:02:55 -- common/autotest_common.sh@651 -- # es=234 00:07:11.502 20:02:55 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:11.502 20:02:55 -- common/autotest_common.sh@660 -- # es=106 00:07:11.502 20:02:55 -- common/autotest_common.sh@661 -- # case "$es" in 00:07:11.502 20:02:55 -- common/autotest_common.sh@668 -- # es=1 00:07:11.502 20:02:55 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:11.502 00:07:11.502 real 0m0.387s 00:07:11.502 user 0m0.258s 00:07:11.502 sys 0m0.166s 00:07:11.502 20:02:55 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:11.502 20:02:55 -- common/autotest_common.sh@10 -- # set +x 00:07:11.502 ************************************ 00:07:11.502 END TEST accel_missing_filename 00:07:11.502 ************************************ 00:07:11.502 20:02:55 -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:11.502 20:02:55 -- common/autotest_common.sh@1097 -- # '[' 10 -le 1 ']' 00:07:11.502 20:02:55 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:11.502 20:02:55 -- common/autotest_common.sh@10 -- # set +x 00:07:11.502 ************************************ 00:07:11.502 START TEST accel_compress_verify 00:07:11.502 ************************************ 00:07:11.502 20:02:55 -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:11.502 20:02:55 -- common/autotest_common.sh@648 -- # local es=0 00:07:11.502 20:02:55 -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:11.502 20:02:55 -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:11.502 20:02:55 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:11.502 20:02:55 -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:11.502 20:02:55 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:11.502 20:02:55 -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:11.502 20:02:55 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:11.502 20:02:55 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.502 20:02:55 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:11.502 20:02:55 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:11.502 20:02:55 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.502 20:02:55 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.502 20:02:55 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:11.502 20:02:55 -- accel/accel.sh@40 -- # local IFS=, 00:07:11.502 20:02:55 -- accel/accel.sh@41 -- # jq -r . 00:07:11.502 [2024-04-26 20:02:55.937174] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:07:11.502 [2024-04-26 20:02:55.937258] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614612 ] 00:07:11.760 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.760 [2024-04-26 20:02:56.022192] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:11.760 [2024-04-26 20:02:56.104332] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.760 [2024-04-26 20:02:56.150285] app.c: 966:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:12.018 [2024-04-26 20:02:56.219474] accel_perf.c:1394:main: *ERROR*: ERROR starting application 00:07:12.018 00:07:12.018 Compression does not support the verify option, aborting. 00:07:12.018 20:02:56 -- common/autotest_common.sh@651 -- # es=161 00:07:12.018 20:02:56 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:12.018 20:02:56 -- common/autotest_common.sh@660 -- # es=33 00:07:12.018 20:02:56 -- common/autotest_common.sh@661 -- # case "$es" in 00:07:12.018 20:02:56 -- common/autotest_common.sh@668 -- # es=1 00:07:12.018 20:02:56 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:12.018 00:07:12.018 real 0m0.382s 00:07:12.018 user 0m0.263s 00:07:12.018 sys 0m0.156s 00:07:12.018 20:02:56 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:12.018 20:02:56 -- common/autotest_common.sh@10 -- # set +x 00:07:12.018 ************************************ 00:07:12.018 END TEST accel_compress_verify 00:07:12.018 ************************************ 00:07:12.018 20:02:56 -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:07:12.018 20:02:56 -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:12.018 20:02:56 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:12.018 20:02:56 -- common/autotest_common.sh@10 -- # set +x 00:07:12.277 ************************************ 00:07:12.277 START TEST accel_wrong_workload 00:07:12.277 ************************************ 00:07:12.277 20:02:56 -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w foobar 00:07:12.278 20:02:56 -- common/autotest_common.sh@648 -- # local es=0 00:07:12.278 20:02:56 -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:07:12.278 20:02:56 -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:12.278 20:02:56 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:12.278 20:02:56 -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:12.278 20:02:56 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:12.278 20:02:56 -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:07:12.278 20:02:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:07:12.278 20:02:56 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.278 20:02:56 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.278 20:02:56 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:12.278 20:02:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.278 20:02:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.278 20:02:56 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:12.278 20:02:56 -- accel/accel.sh@40 -- # local IFS=, 00:07:12.278 20:02:56 -- accel/accel.sh@41 -- # jq -r . 
00:07:12.278 Unsupported workload type: foobar 00:07:12.278 [2024-04-26 20:02:56.531543] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:07:12.278 accel_perf options: 00:07:12.278 [-h help message] 00:07:12.278 [-q queue depth per core] 00:07:12.278 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:12.278 [-T number of threads per core 00:07:12.278 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:12.278 [-t time in seconds] 00:07:12.278 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:12.278 [ dif_verify, , dif_generate, dif_generate_copy 00:07:12.278 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:12.278 [-l for compress/decompress workloads, name of uncompressed input file 00:07:12.278 [-S for crc32c workload, use this seed value (default 0) 00:07:12.278 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:12.278 [-f for fill workload, use this BYTE value (default 255) 00:07:12.278 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:12.278 [-y verify result if this switch is on] 00:07:12.278 [-a tasks to allocate per core (default: same value as -q)] 00:07:12.278 Can be used to spread operations across a wider range of memory. 00:07:12.278 20:02:56 -- common/autotest_common.sh@651 -- # es=1 00:07:12.278 20:02:56 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:12.278 20:02:56 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:12.278 20:02:56 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:12.278 00:07:12.278 real 0m0.028s 00:07:12.278 user 0m0.012s 00:07:12.278 sys 0m0.017s 00:07:12.278 20:02:56 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:12.278 20:02:56 -- common/autotest_common.sh@10 -- # set +x 00:07:12.278 ************************************ 00:07:12.278 END TEST accel_wrong_workload 00:07:12.278 ************************************ 00:07:12.278 Error: writing output failed: Broken pipe 00:07:12.278 20:02:56 -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:07:12.278 20:02:56 -- common/autotest_common.sh@1097 -- # '[' 10 -le 1 ']' 00:07:12.278 20:02:56 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:12.278 20:02:56 -- common/autotest_common.sh@10 -- # set +x 00:07:12.536 ************************************ 00:07:12.536 START TEST accel_negative_buffers 00:07:12.536 ************************************ 00:07:12.536 20:02:56 -- common/autotest_common.sh@1121 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:07:12.536 20:02:56 -- common/autotest_common.sh@648 -- # local es=0 00:07:12.536 20:02:56 -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:07:12.536 20:02:56 -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:07:12.536 20:02:56 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:12.536 20:02:56 -- common/autotest_common.sh@640 -- # type -t accel_perf 00:07:12.536 20:02:56 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:12.536 20:02:56 -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1 00:07:12.536 20:02:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:07:12.536 20:02:56 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.536 20:02:56 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.536 20:02:56 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:12.536 20:02:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.536 20:02:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.536 20:02:56 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:12.536 20:02:56 -- accel/accel.sh@40 -- # local IFS=, 00:07:12.536 20:02:56 -- accel/accel.sh@41 -- # jq -r . 00:07:12.536 -x option must be non-negative. 00:07:12.536 [2024-04-26 20:02:56.766312] app.c:1364:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:07:12.536 accel_perf options: 00:07:12.536 [-h help message] 00:07:12.536 [-q queue depth per core] 00:07:12.536 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:07:12.536 [-T number of threads per core 00:07:12.536 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:07:12.536 [-t time in seconds] 00:07:12.536 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:07:12.536 [ dif_verify, , dif_generate, dif_generate_copy 00:07:12.536 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:07:12.536 [-l for compress/decompress workloads, name of uncompressed input file 00:07:12.536 [-S for crc32c workload, use this seed value (default 0) 00:07:12.536 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:07:12.536 [-f for fill workload, use this BYTE value (default 255) 00:07:12.536 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:07:12.536 [-y verify result if this switch is on] 00:07:12.536 [-a tasks to allocate per core (default: same value as -q)] 00:07:12.536 Can be used to spread operations across a wider range of memory. 
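The usage text above lists the accel_perf flags these tests exercise. As a rough illustration only (not a command taken verbatim from this run; the harness actually feeds its config over -c /dev/fd/62 as logged above), a standalone invocation of the same binary might look like the following sketch, with -q and -o values chosen arbitrarily from the documented defaults:

    # hedged sketch: software crc32c pass for 1 second, 4 KiB transfers, verify results
    ./build/examples/accel_perf -t 1 -w crc32c -S 32 -y -q 32 -o 4096

    # negative check in the spirit of the NOT helper used by these tests:
    # compress without -l (no input file) is expected to exit nonzero
    if ./build/examples/accel_perf -t 1 -w compress; then
        echo "expected failure: compress requires -l <input file>" >&2
        exit 1
    fi

The -S 32, -y, -t 1 and -w values mirror the invocations recorded in this log; -q 32 and -o 4096 are illustrative assumptions only.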
00:07:12.536 20:02:56 -- common/autotest_common.sh@651 -- # es=1 00:07:12.536 20:02:56 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:12.536 20:02:56 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:12.536 20:02:56 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:12.536 00:07:12.536 real 0m0.029s 00:07:12.536 user 0m0.013s 00:07:12.536 sys 0m0.017s 00:07:12.536 20:02:56 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:12.536 20:02:56 -- common/autotest_common.sh@10 -- # set +x 00:07:12.536 ************************************ 00:07:12.536 END TEST accel_negative_buffers 00:07:12.536 ************************************ 00:07:12.536 Error: writing output failed: Broken pipe 00:07:12.536 20:02:56 -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:07:12.536 20:02:56 -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:12.536 20:02:56 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:12.536 20:02:56 -- common/autotest_common.sh@10 -- # set +x 00:07:12.536 ************************************ 00:07:12.536 START TEST accel_crc32c 00:07:12.536 ************************************ 00:07:12.536 20:02:56 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w crc32c -S 32 -y 00:07:12.536 20:02:56 -- accel/accel.sh@16 -- # local accel_opc 00:07:12.536 20:02:56 -- accel/accel.sh@17 -- # local accel_module 00:07:12.536 20:02:56 -- accel/accel.sh@19 -- # IFS=: 00:07:12.536 20:02:56 -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 20:02:56 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:07:12.536 20:02:56 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:07:12.536 20:02:56 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.536 20:02:56 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.795 20:02:56 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:12.795 20:02:56 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.795 20:02:56 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.795 20:02:56 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:12.795 20:02:56 -- accel/accel.sh@40 -- # local IFS=, 00:07:12.795 20:02:56 -- accel/accel.sh@41 -- # jq -r . 00:07:12.795 [2024-04-26 20:02:56.994780] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:07:12.795 [2024-04-26 20:02:56.994859] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1614828 ] 00:07:12.795 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.795 [2024-04-26 20:02:57.078594] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.795 [2024-04-26 20:02:57.162028] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val= 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val= 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val=0x1 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val= 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val= 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val=crc32c 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val=32 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val= 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val=software 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@22 -- # accel_module=software 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val=32 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val=32 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- 
accel/accel.sh@20 -- # val=1 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.795 20:02:57 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:12.795 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.795 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.796 20:02:57 -- accel/accel.sh@20 -- # val=Yes 00:07:12.796 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.796 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.796 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.796 20:02:57 -- accel/accel.sh@20 -- # val= 00:07:12.796 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.796 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.796 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:12.796 20:02:57 -- accel/accel.sh@20 -- # val= 00:07:12.796 20:02:57 -- accel/accel.sh@21 -- # case "$var" in 00:07:12.796 20:02:57 -- accel/accel.sh@19 -- # IFS=: 00:07:12.796 20:02:57 -- accel/accel.sh@19 -- # read -r var val 00:07:14.169 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.169 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.169 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.169 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.169 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.169 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.169 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.169 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.169 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.169 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.169 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.169 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.169 20:02:58 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:14.169 20:02:58 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:14.169 20:02:58 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:14.169 00:07:14.169 real 0m1.371s 00:07:14.169 user 0m1.239s 00:07:14.169 sys 0m0.146s 00:07:14.169 20:02:58 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:14.169 20:02:58 -- common/autotest_common.sh@10 -- # set +x 00:07:14.169 ************************************ 00:07:14.169 END TEST accel_crc32c 00:07:14.169 ************************************ 00:07:14.169 20:02:58 -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:07:14.169 20:02:58 -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:14.169 20:02:58 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:14.169 20:02:58 -- common/autotest_common.sh@10 -- # set +x 00:07:14.169 ************************************ 00:07:14.169 START TEST 
accel_crc32c_C2 00:07:14.169 ************************************ 00:07:14.169 20:02:58 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w crc32c -y -C 2 00:07:14.169 20:02:58 -- accel/accel.sh@16 -- # local accel_opc 00:07:14.169 20:02:58 -- accel/accel.sh@17 -- # local accel_module 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.169 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.169 20:02:58 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:07:14.169 20:02:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:07:14.169 20:02:58 -- accel/accel.sh@12 -- # build_accel_config 00:07:14.169 20:02:58 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.169 20:02:58 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.169 20:02:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.169 20:02:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.169 20:02:58 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:14.169 20:02:58 -- accel/accel.sh@40 -- # local IFS=, 00:07:14.169 20:02:58 -- accel/accel.sh@41 -- # jq -r . 00:07:14.169 [2024-04-26 20:02:58.553267] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:14.169 [2024-04-26 20:02:58.553353] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1615064 ] 00:07:14.169 EAL: No free 2048 kB hugepages reported on node 1 00:07:14.427 [2024-04-26 20:02:58.636087] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.427 [2024-04-26 20:02:58.717997] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val=0x1 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val=crc32c 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val=0 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 
-- accel/accel.sh@20 -- # val='4096 bytes' 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val=software 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@22 -- # accel_module=software 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val=32 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val=32 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val=1 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.427 20:02:58 -- accel/accel.sh@20 -- # val=Yes 00:07:14.427 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.427 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.428 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.428 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.428 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.428 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.428 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:14.428 20:02:58 -- accel/accel.sh@20 -- # val= 00:07:14.428 20:02:58 -- accel/accel.sh@21 -- # case "$var" in 00:07:14.428 20:02:58 -- accel/accel.sh@19 -- # IFS=: 00:07:14.428 20:02:58 -- accel/accel.sh@19 -- # read -r var val 00:07:15.802 20:02:59 -- accel/accel.sh@20 -- # val= 00:07:15.802 20:02:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # IFS=: 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # read -r var val 00:07:15.802 20:02:59 -- accel/accel.sh@20 -- # val= 00:07:15.802 20:02:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # IFS=: 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # read -r var val 00:07:15.802 20:02:59 -- accel/accel.sh@20 -- # val= 00:07:15.802 20:02:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # IFS=: 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # read -r var val 00:07:15.802 20:02:59 -- accel/accel.sh@20 -- # val= 00:07:15.802 20:02:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # IFS=: 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # read -r var val 00:07:15.802 20:02:59 -- accel/accel.sh@20 -- # val= 00:07:15.802 20:02:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # IFS=: 00:07:15.802 20:02:59 
-- accel/accel.sh@19 -- # read -r var val 00:07:15.802 20:02:59 -- accel/accel.sh@20 -- # val= 00:07:15.802 20:02:59 -- accel/accel.sh@21 -- # case "$var" in 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # IFS=: 00:07:15.802 20:02:59 -- accel/accel.sh@19 -- # read -r var val 00:07:15.802 20:02:59 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:15.802 20:02:59 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:07:15.802 20:02:59 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:15.802 00:07:15.802 real 0m1.385s 00:07:15.802 user 0m1.254s 00:07:15.802 sys 0m0.143s 00:07:15.802 20:02:59 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:15.802 20:02:59 -- common/autotest_common.sh@10 -- # set +x 00:07:15.802 ************************************ 00:07:15.802 END TEST accel_crc32c_C2 00:07:15.802 ************************************ 00:07:15.802 20:02:59 -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:07:15.802 20:02:59 -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:15.802 20:02:59 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:15.802 20:02:59 -- common/autotest_common.sh@10 -- # set +x 00:07:15.802 ************************************ 00:07:15.802 START TEST accel_copy 00:07:15.802 ************************************ 00:07:15.802 20:03:00 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy -y 00:07:15.802 20:03:00 -- accel/accel.sh@16 -- # local accel_opc 00:07:15.802 20:03:00 -- accel/accel.sh@17 -- # local accel_module 00:07:15.802 20:03:00 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:07:15.802 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:15.802 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:15.802 20:03:00 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:07:15.802 20:03:00 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.802 20:03:00 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:15.802 20:03:00 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:15.802 20:03:00 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.802 20:03:00 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.802 20:03:00 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:15.802 20:03:00 -- accel/accel.sh@40 -- # local IFS=, 00:07:15.802 20:03:00 -- accel/accel.sh@41 -- # jq -r . 00:07:15.802 [2024-04-26 20:03:00.116731] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:07:15.802 [2024-04-26 20:03:00.116793] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1615284 ] 00:07:15.802 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.802 [2024-04-26 20:03:00.192131] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.061 [2024-04-26 20:03:00.275556] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val= 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val= 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val=0x1 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val= 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val= 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val=copy 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@23 -- # accel_opc=copy 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val= 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val=software 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@22 -- # accel_module=software 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val=32 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val=32 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val=1 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- 
accel/accel.sh@20 -- # val='1 seconds' 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val=Yes 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val= 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:16.061 20:03:00 -- accel/accel.sh@20 -- # val= 00:07:16.061 20:03:00 -- accel/accel.sh@21 -- # case "$var" in 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # IFS=: 00:07:16.061 20:03:00 -- accel/accel.sh@19 -- # read -r var val 00:07:17.436 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.436 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.436 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.436 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.436 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.436 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.436 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.436 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.436 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.436 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.436 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.436 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.436 20:03:01 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:17.436 20:03:01 -- accel/accel.sh@27 -- # [[ -n copy ]] 00:07:17.436 20:03:01 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:17.436 00:07:17.436 real 0m1.371s 00:07:17.436 user 0m1.242s 00:07:17.436 sys 0m0.138s 00:07:17.436 20:03:01 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:17.436 20:03:01 -- common/autotest_common.sh@10 -- # set +x 00:07:17.436 ************************************ 00:07:17.436 END TEST accel_copy 00:07:17.436 ************************************ 00:07:17.436 20:03:01 -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:17.436 20:03:01 -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:17.436 20:03:01 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:17.436 20:03:01 -- common/autotest_common.sh@10 -- # set +x 00:07:17.436 ************************************ 00:07:17.436 START TEST accel_fill 00:07:17.436 ************************************ 00:07:17.436 20:03:01 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:17.436 20:03:01 -- accel/accel.sh@16 -- # local accel_opc 
00:07:17.436 20:03:01 -- accel/accel.sh@17 -- # local accel_module 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.436 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.436 20:03:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:17.436 20:03:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:07:17.436 20:03:01 -- accel/accel.sh@12 -- # build_accel_config 00:07:17.436 20:03:01 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:17.436 20:03:01 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:17.436 20:03:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:17.436 20:03:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:17.436 20:03:01 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:17.436 20:03:01 -- accel/accel.sh@40 -- # local IFS=, 00:07:17.436 20:03:01 -- accel/accel.sh@41 -- # jq -r . 00:07:17.436 [2024-04-26 20:03:01.698421] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:17.436 [2024-04-26 20:03:01.698507] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1615584 ] 00:07:17.436 EAL: No free 2048 kB hugepages reported on node 1 00:07:17.436 [2024-04-26 20:03:01.784450] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:17.436 [2024-04-26 20:03:01.871916] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val=0x1 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val=fill 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@23 -- # accel_opc=fill 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val=0x80 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 
-- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val=software 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@22 -- # accel_module=software 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val=64 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val=64 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val=1 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val=Yes 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:17.695 20:03:01 -- accel/accel.sh@20 -- # val= 00:07:17.695 20:03:01 -- accel/accel.sh@21 -- # case "$var" in 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # IFS=: 00:07:17.695 20:03:01 -- accel/accel.sh@19 -- # read -r var val 00:07:18.630 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:18.630 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:18.630 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:18.630 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:18.630 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:18.630 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:18.630 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:18.630 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:18.630 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:18.630 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:18.630 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:18.630 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:18.630 20:03:03 -- accel/accel.sh@19 
-- # IFS=: 00:07:18.630 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:18.888 20:03:03 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:18.888 20:03:03 -- accel/accel.sh@27 -- # [[ -n fill ]] 00:07:18.888 20:03:03 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:18.888 00:07:18.888 real 0m1.396s 00:07:18.888 user 0m1.260s 00:07:18.888 sys 0m0.150s 00:07:18.888 20:03:03 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:18.888 20:03:03 -- common/autotest_common.sh@10 -- # set +x 00:07:18.888 ************************************ 00:07:18.888 END TEST accel_fill 00:07:18.888 ************************************ 00:07:18.888 20:03:03 -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:07:18.888 20:03:03 -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:18.888 20:03:03 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:18.888 20:03:03 -- common/autotest_common.sh@10 -- # set +x 00:07:18.888 ************************************ 00:07:18.888 START TEST accel_copy_crc32c 00:07:18.888 ************************************ 00:07:18.888 20:03:03 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy_crc32c -y 00:07:18.888 20:03:03 -- accel/accel.sh@16 -- # local accel_opc 00:07:18.888 20:03:03 -- accel/accel.sh@17 -- # local accel_module 00:07:18.888 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:18.888 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:18.888 20:03:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:07:18.888 20:03:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:07:18.888 20:03:03 -- accel/accel.sh@12 -- # build_accel_config 00:07:18.888 20:03:03 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:18.888 20:03:03 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:18.888 20:03:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.888 20:03:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.888 20:03:03 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:18.888 20:03:03 -- accel/accel.sh@40 -- # local IFS=, 00:07:18.888 20:03:03 -- accel/accel.sh@41 -- # jq -r . 00:07:18.888 [2024-04-26 20:03:03.287904] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:07:18.888 [2024-04-26 20:03:03.287985] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1615805 ] 00:07:19.146 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.146 [2024-04-26 20:03:03.375249] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.146 [2024-04-26 20:03:03.458913] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val=0x1 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val=0 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val=software 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@22 -- # accel_module=software 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val=32 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 
00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val=32 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val=1 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val=Yes 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:19.146 20:03:03 -- accel/accel.sh@20 -- # val= 00:07:19.146 20:03:03 -- accel/accel.sh@21 -- # case "$var" in 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # IFS=: 00:07:19.146 20:03:03 -- accel/accel.sh@19 -- # read -r var val 00:07:20.582 20:03:04 -- accel/accel.sh@20 -- # val= 00:07:20.582 20:03:04 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # IFS=: 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # read -r var val 00:07:20.582 20:03:04 -- accel/accel.sh@20 -- # val= 00:07:20.582 20:03:04 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # IFS=: 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # read -r var val 00:07:20.582 20:03:04 -- accel/accel.sh@20 -- # val= 00:07:20.582 20:03:04 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # IFS=: 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # read -r var val 00:07:20.582 20:03:04 -- accel/accel.sh@20 -- # val= 00:07:20.582 20:03:04 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # IFS=: 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # read -r var val 00:07:20.582 20:03:04 -- accel/accel.sh@20 -- # val= 00:07:20.582 20:03:04 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # IFS=: 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # read -r var val 00:07:20.582 20:03:04 -- accel/accel.sh@20 -- # val= 00:07:20.582 20:03:04 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # IFS=: 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # read -r var val 00:07:20.582 20:03:04 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:20.582 20:03:04 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:20.582 20:03:04 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:20.582 00:07:20.582 real 0m1.379s 00:07:20.582 user 0m1.233s 00:07:20.582 sys 0m0.157s 00:07:20.582 20:03:04 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:20.582 20:03:04 -- common/autotest_common.sh@10 -- # set +x 00:07:20.582 ************************************ 00:07:20.582 END TEST accel_copy_crc32c 00:07:20.582 ************************************ 00:07:20.582 20:03:04 -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:07:20.582 
20:03:04 -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:20.582 20:03:04 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:20.582 20:03:04 -- common/autotest_common.sh@10 -- # set +x 00:07:20.582 ************************************ 00:07:20.582 START TEST accel_copy_crc32c_C2 00:07:20.582 ************************************ 00:07:20.582 20:03:04 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:07:20.582 20:03:04 -- accel/accel.sh@16 -- # local accel_opc 00:07:20.582 20:03:04 -- accel/accel.sh@17 -- # local accel_module 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # IFS=: 00:07:20.582 20:03:04 -- accel/accel.sh@19 -- # read -r var val 00:07:20.582 20:03:04 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:07:20.582 20:03:04 -- accel/accel.sh@12 -- # build_accel_config 00:07:20.582 20:03:04 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:07:20.582 20:03:04 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:20.582 20:03:04 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:20.582 20:03:04 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.582 20:03:04 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.582 20:03:04 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:20.582 20:03:04 -- accel/accel.sh@40 -- # local IFS=, 00:07:20.582 20:03:04 -- accel/accel.sh@41 -- # jq -r . 00:07:20.582 [2024-04-26 20:03:04.863047] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:20.582 [2024-04-26 20:03:04.863127] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1616347 ] 00:07:20.582 EAL: No free 2048 kB hugepages reported on node 1 00:07:20.582 [2024-04-26 20:03:04.951288] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.841 [2024-04-26 20:03:05.037690] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val= 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val= 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val=0x1 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val= 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val= 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val=copy_crc32c 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 
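The copy_crc32c runs above and below are driven through accel.sh's accel_test wrapper, which execs the accel_perf example with the command line echoed in this trace. For reference, a minimal hand-run of the same workload could look like the sketch below; the binary path is the one printed in the log, and dropping the JSON config the harness feeds on /dev/fd/62 via -c is an assumption, not something the log shows.

    # sketch: 1-second software (-y) copy_crc32c run, with the -C 2 variant this test exercises
    /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
        -t 1 -w copy_crc32c -y -C 2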
00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val=0 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val='8192 bytes' 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val= 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val=software 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@22 -- # accel_module=software 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val=32 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val=32 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val=1 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val=Yes 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val= 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:20.841 20:03:05 -- accel/accel.sh@20 -- # val= 00:07:20.841 20:03:05 -- accel/accel.sh@21 -- # case "$var" in 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # IFS=: 00:07:20.841 20:03:05 -- accel/accel.sh@19 -- # read -r var val 00:07:21.774 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:21.774 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.774 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:21.774 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:21.774 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:21.774 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.774 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:21.774 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:21.774 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:21.774 20:03:06 -- 
accel/accel.sh@21 -- # case "$var" in 00:07:21.774 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:21.774 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:21.774 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:21.774 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:21.774 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:21.774 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.032 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.032 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.032 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.032 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.032 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.032 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.032 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.032 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.032 20:03:06 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:22.032 20:03:06 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:07:22.032 20:03:06 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:22.032 00:07:22.032 real 0m1.379s 00:07:22.032 user 0m1.236s 00:07:22.032 sys 0m0.155s 00:07:22.032 20:03:06 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:22.032 20:03:06 -- common/autotest_common.sh@10 -- # set +x 00:07:22.032 ************************************ 00:07:22.032 END TEST accel_copy_crc32c_C2 00:07:22.032 ************************************ 00:07:22.032 20:03:06 -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:07:22.032 20:03:06 -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:22.032 20:03:06 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:22.032 20:03:06 -- common/autotest_common.sh@10 -- # set +x 00:07:22.032 ************************************ 00:07:22.032 START TEST accel_dualcast 00:07:22.032 ************************************ 00:07:22.032 20:03:06 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dualcast -y 00:07:22.032 20:03:06 -- accel/accel.sh@16 -- # local accel_opc 00:07:22.032 20:03:06 -- accel/accel.sh@17 -- # local accel_module 00:07:22.032 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.032 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.032 20:03:06 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:07:22.032 20:03:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:07:22.032 20:03:06 -- accel/accel.sh@12 -- # build_accel_config 00:07:22.032 20:03:06 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:22.032 20:03:06 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:22.032 20:03:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.032 20:03:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.032 20:03:06 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:22.032 20:03:06 -- accel/accel.sh@40 -- # local IFS=, 00:07:22.032 20:03:06 -- accel/accel.sh@41 -- # jq -r . 00:07:22.032 [2024-04-26 20:03:06.437463] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:07:22.032 [2024-04-26 20:03:06.437546] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1616766 ] 00:07:22.291 EAL: No free 2048 kB hugepages reported on node 1 00:07:22.291 [2024-04-26 20:03:06.523272] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.291 [2024-04-26 20:03:06.604473] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val=0x1 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val=dualcast 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@23 -- # accel_opc=dualcast 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val=software 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@22 -- # accel_module=software 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val=32 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val=32 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val=1 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 
-- accel/accel.sh@20 -- # val='1 seconds' 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val=Yes 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:22.291 20:03:06 -- accel/accel.sh@20 -- # val= 00:07:22.291 20:03:06 -- accel/accel.sh@21 -- # case "$var" in 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # IFS=: 00:07:22.291 20:03:06 -- accel/accel.sh@19 -- # read -r var val 00:07:23.667 20:03:07 -- accel/accel.sh@20 -- # val= 00:07:23.667 20:03:07 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # IFS=: 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # read -r var val 00:07:23.667 20:03:07 -- accel/accel.sh@20 -- # val= 00:07:23.667 20:03:07 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # IFS=: 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # read -r var val 00:07:23.667 20:03:07 -- accel/accel.sh@20 -- # val= 00:07:23.667 20:03:07 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # IFS=: 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # read -r var val 00:07:23.667 20:03:07 -- accel/accel.sh@20 -- # val= 00:07:23.667 20:03:07 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # IFS=: 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # read -r var val 00:07:23.667 20:03:07 -- accel/accel.sh@20 -- # val= 00:07:23.667 20:03:07 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # IFS=: 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # read -r var val 00:07:23.667 20:03:07 -- accel/accel.sh@20 -- # val= 00:07:23.667 20:03:07 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # IFS=: 00:07:23.667 20:03:07 -- accel/accel.sh@19 -- # read -r var val 00:07:23.667 20:03:07 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:23.667 20:03:07 -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:07:23.667 20:03:07 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:23.667 00:07:23.667 real 0m1.389s 00:07:23.667 user 0m1.250s 00:07:23.667 sys 0m0.151s 00:07:23.667 20:03:07 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:23.667 20:03:07 -- common/autotest_common.sh@10 -- # set +x 00:07:23.667 ************************************ 00:07:23.667 END TEST accel_dualcast 00:07:23.667 ************************************ 00:07:23.667 20:03:07 -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:07:23.667 20:03:07 -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:23.667 20:03:07 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:23.667 20:03:07 -- common/autotest_common.sh@10 -- # set +x 00:07:23.667 ************************************ 00:07:23.667 START TEST accel_compare 00:07:23.667 ************************************ 00:07:23.667 20:03:08 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compare -y 00:07:23.667 20:03:08 -- accel/accel.sh@16 -- # local accel_opc 00:07:23.667 20:03:08 
-- accel/accel.sh@17 -- # local accel_module 00:07:23.667 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.667 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.667 20:03:08 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:07:23.667 20:03:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:07:23.667 20:03:08 -- accel/accel.sh@12 -- # build_accel_config 00:07:23.667 20:03:08 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:23.667 20:03:08 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:23.667 20:03:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:23.667 20:03:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:23.667 20:03:08 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:23.667 20:03:08 -- accel/accel.sh@40 -- # local IFS=, 00:07:23.667 20:03:08 -- accel/accel.sh@41 -- # jq -r . 00:07:23.667 [2024-04-26 20:03:08.041753] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:23.667 [2024-04-26 20:03:08.041846] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1616966 ] 00:07:23.667 EAL: No free 2048 kB hugepages reported on node 1 00:07:23.926 [2024-04-26 20:03:08.127577] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.926 [2024-04-26 20:03:08.207498] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val= 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val= 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val=0x1 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val= 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val= 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val=compare 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@23 -- # accel_opc=compare 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val= 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- 
accel/accel.sh@20 -- # val=software 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@22 -- # accel_module=software 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val=32 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val=32 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val=1 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val=Yes 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val= 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:23.926 20:03:08 -- accel/accel.sh@20 -- # val= 00:07:23.926 20:03:08 -- accel/accel.sh@21 -- # case "$var" in 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # IFS=: 00:07:23.926 20:03:08 -- accel/accel.sh@19 -- # read -r var val 00:07:25.302 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.302 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.302 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.302 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.302 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.302 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.302 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.302 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.302 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.302 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.302 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.303 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.303 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.303 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.303 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.303 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.303 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.303 20:03:09 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:25.303 20:03:09 -- accel/accel.sh@27 -- # [[ -n compare ]] 00:07:25.303 20:03:09 -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:07:25.303 00:07:25.303 real 0m1.375s 00:07:25.303 user 0m1.232s 00:07:25.303 sys 0m0.154s 00:07:25.303 20:03:09 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:25.303 20:03:09 -- common/autotest_common.sh@10 -- # set +x 00:07:25.303 ************************************ 00:07:25.303 END TEST accel_compare 00:07:25.303 ************************************ 00:07:25.303 20:03:09 -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:07:25.303 20:03:09 -- common/autotest_common.sh@1097 -- # '[' 7 -le 1 ']' 00:07:25.303 20:03:09 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:25.303 20:03:09 -- common/autotest_common.sh@10 -- # set +x 00:07:25.303 ************************************ 00:07:25.303 START TEST accel_xor 00:07:25.303 ************************************ 00:07:25.303 20:03:09 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w xor -y 00:07:25.303 20:03:09 -- accel/accel.sh@16 -- # local accel_opc 00:07:25.303 20:03:09 -- accel/accel.sh@17 -- # local accel_module 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.303 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.303 20:03:09 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:07:25.303 20:03:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:07:25.303 20:03:09 -- accel/accel.sh@12 -- # build_accel_config 00:07:25.303 20:03:09 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:25.303 20:03:09 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:25.303 20:03:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:25.303 20:03:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:25.303 20:03:09 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:25.303 20:03:09 -- accel/accel.sh@40 -- # local IFS=, 00:07:25.303 20:03:09 -- accel/accel.sh@41 -- # jq -r . 00:07:25.303 [2024-04-26 20:03:09.625309] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
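The xor pass starting here uses the default source count (val=2 in the dump that follows); the second accel_xor pass later in this trace adds -x 3 and its dump records val=3. A hedged sketch of both manual runs, reusing only flags visible in this log and again assuming the -c /dev/fd/62 config can be omitted:

    # software (-y) xor, 1 second each: default sources, then the -x 3 variant
    ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
    "$ACCEL_PERF" -t 1 -w xor -y
    "$ACCEL_PERF" -t 1 -w xor -y -x 3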
00:07:25.303 [2024-04-26 20:03:09.625392] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617177 ] 00:07:25.303 EAL: No free 2048 kB hugepages reported on node 1 00:07:25.303 [2024-04-26 20:03:09.710524] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.561 [2024-04-26 20:03:09.794075] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val=0x1 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val=xor 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@23 -- # accel_opc=xor 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val=2 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val=software 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@22 -- # accel_module=software 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val=32 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.561 20:03:09 -- accel/accel.sh@20 -- # val=32 00:07:25.561 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.561 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.562 20:03:09 -- 
accel/accel.sh@20 -- # val=1 00:07:25.562 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.562 20:03:09 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:25.562 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.562 20:03:09 -- accel/accel.sh@20 -- # val=Yes 00:07:25.562 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.562 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.562 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:25.562 20:03:09 -- accel/accel.sh@20 -- # val= 00:07:25.562 20:03:09 -- accel/accel.sh@21 -- # case "$var" in 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # IFS=: 00:07:25.562 20:03:09 -- accel/accel.sh@19 -- # read -r var val 00:07:26.936 20:03:10 -- accel/accel.sh@20 -- # val= 00:07:26.936 20:03:10 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.936 20:03:10 -- accel/accel.sh@19 -- # IFS=: 00:07:26.936 20:03:10 -- accel/accel.sh@19 -- # read -r var val 00:07:26.936 20:03:10 -- accel/accel.sh@20 -- # val= 00:07:26.936 20:03:10 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.936 20:03:10 -- accel/accel.sh@19 -- # IFS=: 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # read -r var val 00:07:26.937 20:03:10 -- accel/accel.sh@20 -- # val= 00:07:26.937 20:03:10 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # IFS=: 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # read -r var val 00:07:26.937 20:03:10 -- accel/accel.sh@20 -- # val= 00:07:26.937 20:03:10 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # IFS=: 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # read -r var val 00:07:26.937 20:03:10 -- accel/accel.sh@20 -- # val= 00:07:26.937 20:03:10 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # IFS=: 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # read -r var val 00:07:26.937 20:03:10 -- accel/accel.sh@20 -- # val= 00:07:26.937 20:03:10 -- accel/accel.sh@21 -- # case "$var" in 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # IFS=: 00:07:26.937 20:03:10 -- accel/accel.sh@19 -- # read -r var val 00:07:26.937 20:03:10 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:26.937 20:03:10 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:26.937 20:03:10 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:26.937 00:07:26.937 real 0m1.391s 00:07:26.937 user 0m1.252s 00:07:26.937 sys 0m0.150s 00:07:26.937 20:03:10 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:26.937 20:03:10 -- common/autotest_common.sh@10 -- # set +x 00:07:26.937 ************************************ 00:07:26.937 END TEST accel_xor 00:07:26.937 ************************************ 00:07:26.937 20:03:11 -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:07:26.937 20:03:11 -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:26.937 20:03:11 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:26.937 20:03:11 -- common/autotest_common.sh@10 -- # set +x 00:07:26.937 ************************************ 00:07:26.937 START TEST accel_xor 
00:07:26.937 ************************************ 00:07:26.937 20:03:11 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w xor -y -x 3 00:07:26.937 20:03:11 -- accel/accel.sh@16 -- # local accel_opc 00:07:26.937 20:03:11 -- accel/accel.sh@17 -- # local accel_module 00:07:26.937 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:26.937 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:26.937 20:03:11 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:07:26.937 20:03:11 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:07:26.937 20:03:11 -- accel/accel.sh@12 -- # build_accel_config 00:07:26.937 20:03:11 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:26.937 20:03:11 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:26.937 20:03:11 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:26.937 20:03:11 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:26.937 20:03:11 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:26.937 20:03:11 -- accel/accel.sh@40 -- # local IFS=, 00:07:26.937 20:03:11 -- accel/accel.sh@41 -- # jq -r . 00:07:26.937 [2024-04-26 20:03:11.229033] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:26.937 [2024-04-26 20:03:11.229118] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617382 ] 00:07:26.937 EAL: No free 2048 kB hugepages reported on node 1 00:07:26.937 [2024-04-26 20:03:11.315715] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.204 [2024-04-26 20:03:11.400685] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.204 20:03:11 -- accel/accel.sh@20 -- # val= 00:07:27.204 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.204 20:03:11 -- accel/accel.sh@20 -- # val= 00:07:27.204 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.204 20:03:11 -- accel/accel.sh@20 -- # val=0x1 00:07:27.204 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.204 20:03:11 -- accel/accel.sh@20 -- # val= 00:07:27.204 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.204 20:03:11 -- accel/accel.sh@20 -- # val= 00:07:27.204 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.204 20:03:11 -- accel/accel.sh@20 -- # val=xor 00:07:27.204 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.204 20:03:11 -- accel/accel.sh@23 -- # accel_opc=xor 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.204 20:03:11 -- accel/accel.sh@20 -- # val=3 00:07:27.204 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.204 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.205 20:03:11 -- accel/accel.sh@20 -- # 
val='4096 bytes' 00:07:27.205 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.205 20:03:11 -- accel/accel.sh@20 -- # val= 00:07:27.205 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.205 20:03:11 -- accel/accel.sh@20 -- # val=software 00:07:27.205 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.205 20:03:11 -- accel/accel.sh@22 -- # accel_module=software 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.205 20:03:11 -- accel/accel.sh@20 -- # val=32 00:07:27.205 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.205 20:03:11 -- accel/accel.sh@20 -- # val=32 00:07:27.205 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.205 20:03:11 -- accel/accel.sh@20 -- # val=1 00:07:27.205 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.205 20:03:11 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:27.205 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.205 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.206 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.206 20:03:11 -- accel/accel.sh@20 -- # val=Yes 00:07:27.206 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.206 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.206 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.206 20:03:11 -- accel/accel.sh@20 -- # val= 00:07:27.206 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.206 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.206 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:27.206 20:03:11 -- accel/accel.sh@20 -- # val= 00:07:27.206 20:03:11 -- accel/accel.sh@21 -- # case "$var" in 00:07:27.206 20:03:11 -- accel/accel.sh@19 -- # IFS=: 00:07:27.206 20:03:11 -- accel/accel.sh@19 -- # read -r var val 00:07:28.587 20:03:12 -- accel/accel.sh@20 -- # val= 00:07:28.587 20:03:12 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # IFS=: 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # read -r var val 00:07:28.587 20:03:12 -- accel/accel.sh@20 -- # val= 00:07:28.587 20:03:12 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # IFS=: 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # read -r var val 00:07:28.587 20:03:12 -- accel/accel.sh@20 -- # val= 00:07:28.587 20:03:12 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # IFS=: 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # read -r var val 00:07:28.587 20:03:12 -- accel/accel.sh@20 -- # val= 00:07:28.587 20:03:12 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # IFS=: 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # read -r var val 00:07:28.587 20:03:12 -- accel/accel.sh@20 -- # val= 00:07:28.587 20:03:12 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # IFS=: 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # 
read -r var val 00:07:28.587 20:03:12 -- accel/accel.sh@20 -- # val= 00:07:28.587 20:03:12 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # IFS=: 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # read -r var val 00:07:28.587 20:03:12 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:28.587 20:03:12 -- accel/accel.sh@27 -- # [[ -n xor ]] 00:07:28.587 20:03:12 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:28.587 00:07:28.587 real 0m1.393s 00:07:28.587 user 0m1.256s 00:07:28.587 sys 0m0.148s 00:07:28.587 20:03:12 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:28.587 20:03:12 -- common/autotest_common.sh@10 -- # set +x 00:07:28.587 ************************************ 00:07:28.587 END TEST accel_xor 00:07:28.587 ************************************ 00:07:28.587 20:03:12 -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:07:28.587 20:03:12 -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:07:28.587 20:03:12 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:28.587 20:03:12 -- common/autotest_common.sh@10 -- # set +x 00:07:28.587 ************************************ 00:07:28.587 START TEST accel_dif_verify 00:07:28.587 ************************************ 00:07:28.587 20:03:12 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_verify 00:07:28.587 20:03:12 -- accel/accel.sh@16 -- # local accel_opc 00:07:28.587 20:03:12 -- accel/accel.sh@17 -- # local accel_module 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # IFS=: 00:07:28.587 20:03:12 -- accel/accel.sh@19 -- # read -r var val 00:07:28.587 20:03:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:07:28.587 20:03:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:07:28.587 20:03:12 -- accel/accel.sh@12 -- # build_accel_config 00:07:28.587 20:03:12 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:28.587 20:03:12 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:28.587 20:03:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:28.587 20:03:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:28.587 20:03:12 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:28.587 20:03:12 -- accel/accel.sh@40 -- # local IFS=, 00:07:28.587 20:03:12 -- accel/accel.sh@41 -- # jq -r . 00:07:28.587 [2024-04-26 20:03:12.815931] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:07:28.587 [2024-04-26 20:03:12.816020] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617642 ] 00:07:28.587 EAL: No free 2048 kB hugepages reported on node 1 00:07:28.587 [2024-04-26 20:03:12.900839] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:28.587 [2024-04-26 20:03:12.984616] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val= 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val= 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val=0x1 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val= 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val= 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val=dif_verify 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val='512 bytes' 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val='8 bytes' 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val= 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val=software 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@22 -- # accel_module=software 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r 
var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val=32 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val=32 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val=1 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val=No 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val= 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:28.845 20:03:13 -- accel/accel.sh@20 -- # val= 00:07:28.845 20:03:13 -- accel/accel.sh@21 -- # case "$var" in 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # IFS=: 00:07:28.845 20:03:13 -- accel/accel.sh@19 -- # read -r var val 00:07:29.780 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:29.780 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:29.780 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:29.780 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:29.780 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:29.780 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:29.780 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:29.780 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:29.780 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:29.780 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:29.780 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:29.780 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:29.780 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:29.780 20:03:14 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:29.780 20:03:14 -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:07:29.780 20:03:14 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:29.780 00:07:29.780 real 0m1.390s 00:07:29.780 user 0m1.254s 00:07:29.780 sys 0m0.150s 00:07:29.780 20:03:14 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:29.780 20:03:14 -- common/autotest_common.sh@10 -- # set +x 00:07:29.780 
************************************ 00:07:29.780 END TEST accel_dif_verify 00:07:29.780 ************************************ 00:07:30.038 20:03:14 -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:07:30.038 20:03:14 -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:07:30.038 20:03:14 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:30.038 20:03:14 -- common/autotest_common.sh@10 -- # set +x 00:07:30.038 ************************************ 00:07:30.038 START TEST accel_dif_generate 00:07:30.038 ************************************ 00:07:30.038 20:03:14 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_generate 00:07:30.038 20:03:14 -- accel/accel.sh@16 -- # local accel_opc 00:07:30.038 20:03:14 -- accel/accel.sh@17 -- # local accel_module 00:07:30.038 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.038 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.038 20:03:14 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:07:30.038 20:03:14 -- accel/accel.sh@12 -- # build_accel_config 00:07:30.038 20:03:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:07:30.038 20:03:14 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:30.038 20:03:14 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:30.038 20:03:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:30.038 20:03:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:30.038 20:03:14 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:30.038 20:03:14 -- accel/accel.sh@40 -- # local IFS=, 00:07:30.038 20:03:14 -- accel/accel.sh@41 -- # jq -r . 00:07:30.038 [2024-04-26 20:03:14.394493] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
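accel_dif_verify above ran against 4096-byte buffers carved into 512-byte blocks with 8-byte DIF fields ('4096 bytes', '512 bytes', '8 bytes' in its val= dump), and accel_dif_generate, starting here, is configured the same way in the dump that follows. A minimal sketch of the equivalent manual runs, with the same caveat that omitting the fd-based -c config is an assumption:

    # DIF verify, then DIF generate, 1 second each; no size flags are passed in this log,
    # so the 4096/512/8 shape recorded in the dumps is whatever accel_perf uses by default here
    ACCEL_PERF=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
    "$ACCEL_PERF" -t 1 -w dif_verify
    "$ACCEL_PERF" -t 1 -w dif_generate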
00:07:30.038 [2024-04-26 20:03:14.394572] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1617951 ] 00:07:30.038 EAL: No free 2048 kB hugepages reported on node 1 00:07:30.038 [2024-04-26 20:03:14.476675] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:30.297 [2024-04-26 20:03:14.555425] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val=0x1 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val=dif_generate 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.297 20:03:14 -- accel/accel.sh@20 -- # val='512 bytes' 00:07:30.297 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.297 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val='8 bytes' 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val=software 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@22 -- # accel_module=software 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read 
-r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val=32 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val=32 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val=1 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val=No 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:30.298 20:03:14 -- accel/accel.sh@20 -- # val= 00:07:30.298 20:03:14 -- accel/accel.sh@21 -- # case "$var" in 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # IFS=: 00:07:30.298 20:03:14 -- accel/accel.sh@19 -- # read -r var val 00:07:31.673 20:03:15 -- accel/accel.sh@20 -- # val= 00:07:31.673 20:03:15 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.673 20:03:15 -- accel/accel.sh@19 -- # IFS=: 00:07:31.673 20:03:15 -- accel/accel.sh@19 -- # read -r var val 00:07:31.673 20:03:15 -- accel/accel.sh@20 -- # val= 00:07:31.673 20:03:15 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.673 20:03:15 -- accel/accel.sh@19 -- # IFS=: 00:07:31.673 20:03:15 -- accel/accel.sh@19 -- # read -r var val 00:07:31.674 20:03:15 -- accel/accel.sh@20 -- # val= 00:07:31.674 20:03:15 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # IFS=: 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # read -r var val 00:07:31.674 20:03:15 -- accel/accel.sh@20 -- # val= 00:07:31.674 20:03:15 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # IFS=: 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # read -r var val 00:07:31.674 20:03:15 -- accel/accel.sh@20 -- # val= 00:07:31.674 20:03:15 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # IFS=: 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # read -r var val 00:07:31.674 20:03:15 -- accel/accel.sh@20 -- # val= 00:07:31.674 20:03:15 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # IFS=: 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # read -r var val 00:07:31.674 20:03:15 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:31.674 20:03:15 -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:07:31.674 20:03:15 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:31.674 00:07:31.674 real 0m1.384s 00:07:31.674 user 0m1.251s 00:07:31.674 sys 0m0.145s 00:07:31.674 20:03:15 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:31.674 20:03:15 -- common/autotest_common.sh@10 -- # set +x 00:07:31.674 
************************************ 00:07:31.674 END TEST accel_dif_generate 00:07:31.674 ************************************ 00:07:31.674 20:03:15 -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:07:31.674 20:03:15 -- common/autotest_common.sh@1097 -- # '[' 6 -le 1 ']' 00:07:31.674 20:03:15 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:31.674 20:03:15 -- common/autotest_common.sh@10 -- # set +x 00:07:31.674 ************************************ 00:07:31.674 START TEST accel_dif_generate_copy 00:07:31.674 ************************************ 00:07:31.674 20:03:15 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w dif_generate_copy 00:07:31.674 20:03:15 -- accel/accel.sh@16 -- # local accel_opc 00:07:31.674 20:03:15 -- accel/accel.sh@17 -- # local accel_module 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # IFS=: 00:07:31.674 20:03:15 -- accel/accel.sh@19 -- # read -r var val 00:07:31.674 20:03:15 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:07:31.674 20:03:15 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:07:31.674 20:03:15 -- accel/accel.sh@12 -- # build_accel_config 00:07:31.674 20:03:15 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:31.674 20:03:15 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:31.674 20:03:15 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:31.674 20:03:15 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:31.674 20:03:15 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:31.674 20:03:15 -- accel/accel.sh@40 -- # local IFS=, 00:07:31.674 20:03:15 -- accel/accel.sh@41 -- # jq -r . 00:07:31.674 [2024-04-26 20:03:15.993404] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
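Each of these accel cases wraps a single run of the accel_perf example binary. A minimal standalone sketch of the dif_generate_copy invocation logged above (same SPDK build tree assumed; the -c /dev/fd/62 argument is the JSON accel config that accel.sh's build_accel_config supplies via process substitution, empty here so the software module ends up being used):

  # 1-second dif_generate_copy run through the software accel module
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
      -c /dev/fd/62 -t 1 -w dif_generate_copy

The case/read trace lines that follow are accel.sh parsing the configuration accel_perf prints back, which feeds the [[ -n software ]] and [[ -n dif_generate_copy ]] checks at the end of the test.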
00:07:31.674 [2024-04-26 20:03:15.993490] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618153 ] 00:07:31.674 EAL: No free 2048 kB hugepages reported on node 1 00:07:31.674 [2024-04-26 20:03:16.080579] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:31.933 [2024-04-26 20:03:16.168703] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val= 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val= 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val=0x1 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val= 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val= 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val=dif_generate_copy 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val= 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val=software 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@22 -- # accel_module=software 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val=32 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val=32 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r 
var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val=1 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val=No 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val= 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:31.933 20:03:16 -- accel/accel.sh@20 -- # val= 00:07:31.933 20:03:16 -- accel/accel.sh@21 -- # case "$var" in 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # IFS=: 00:07:31.933 20:03:16 -- accel/accel.sh@19 -- # read -r var val 00:07:33.312 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.312 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.312 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.312 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.312 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.312 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.312 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.312 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.312 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.312 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.312 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.313 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.313 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.313 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.313 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.313 20:03:17 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:33.313 20:03:17 -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:07:33.313 20:03:17 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:33.313 00:07:33.313 real 0m1.400s 00:07:33.313 user 0m1.248s 00:07:33.313 sys 0m0.164s 00:07:33.313 20:03:17 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:33.313 20:03:17 -- common/autotest_common.sh@10 -- # set +x 00:07:33.313 ************************************ 00:07:33.313 END TEST accel_dif_generate_copy 00:07:33.313 ************************************ 00:07:33.313 20:03:17 -- accel/accel.sh@115 -- # [[ y == y ]] 00:07:33.313 20:03:17 -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.313 20:03:17 -- common/autotest_common.sh@1097 -- # '[' 8 -le 1 ']' 00:07:33.313 20:03:17 -- 
common/autotest_common.sh@1103 -- # xtrace_disable 00:07:33.313 20:03:17 -- common/autotest_common.sh@10 -- # set +x 00:07:33.313 ************************************ 00:07:33.313 START TEST accel_comp 00:07:33.313 ************************************ 00:07:33.313 20:03:17 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.313 20:03:17 -- accel/accel.sh@16 -- # local accel_opc 00:07:33.313 20:03:17 -- accel/accel.sh@17 -- # local accel_module 00:07:33.313 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.313 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.313 20:03:17 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.313 20:03:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.313 20:03:17 -- accel/accel.sh@12 -- # build_accel_config 00:07:33.313 20:03:17 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:33.313 20:03:17 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:33.313 20:03:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:33.313 20:03:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:33.313 20:03:17 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:33.313 20:03:17 -- accel/accel.sh@40 -- # local IFS=, 00:07:33.313 20:03:17 -- accel/accel.sh@41 -- # jq -r . 00:07:33.313 [2024-04-26 20:03:17.588528] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:33.313 [2024-04-26 20:03:17.588600] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618358 ] 00:07:33.313 EAL: No free 2048 kB hugepages reported on node 1 00:07:33.313 [2024-04-26 20:03:17.672963] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:33.573 [2024-04-26 20:03:17.757624] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val=0x1 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 
20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val=compress 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@23 -- # accel_opc=compress 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val=software 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@22 -- # accel_module=software 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val=32 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val=32 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val=1 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val=No 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:33.573 20:03:17 -- accel/accel.sh@20 -- # val= 00:07:33.573 20:03:17 -- accel/accel.sh@21 -- # case "$var" in 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # IFS=: 00:07:33.573 20:03:17 -- accel/accel.sh@19 -- # read -r var val 00:07:34.950 20:03:18 -- accel/accel.sh@20 -- # val= 00:07:34.950 20:03:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # IFS=: 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # read -r var val 00:07:34.950 20:03:18 -- accel/accel.sh@20 -- # val= 00:07:34.950 20:03:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # IFS=: 00:07:34.950 20:03:18 -- accel/accel.sh@19 
-- # read -r var val 00:07:34.950 20:03:18 -- accel/accel.sh@20 -- # val= 00:07:34.950 20:03:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # IFS=: 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # read -r var val 00:07:34.950 20:03:18 -- accel/accel.sh@20 -- # val= 00:07:34.950 20:03:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # IFS=: 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # read -r var val 00:07:34.950 20:03:18 -- accel/accel.sh@20 -- # val= 00:07:34.950 20:03:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # IFS=: 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # read -r var val 00:07:34.950 20:03:18 -- accel/accel.sh@20 -- # val= 00:07:34.950 20:03:18 -- accel/accel.sh@21 -- # case "$var" in 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # IFS=: 00:07:34.950 20:03:18 -- accel/accel.sh@19 -- # read -r var val 00:07:34.950 20:03:18 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:34.950 20:03:18 -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:34.950 20:03:18 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:34.950 00:07:34.950 real 0m1.393s 00:07:34.950 user 0m1.259s 00:07:34.950 sys 0m0.148s 00:07:34.950 20:03:18 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:34.950 20:03:18 -- common/autotest_common.sh@10 -- # set +x 00:07:34.950 ************************************ 00:07:34.950 END TEST accel_comp 00:07:34.950 ************************************ 00:07:34.950 20:03:19 -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:34.950 20:03:19 -- common/autotest_common.sh@1097 -- # '[' 9 -le 1 ']' 00:07:34.950 20:03:19 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:34.950 20:03:19 -- common/autotest_common.sh@10 -- # set +x 00:07:34.950 ************************************ 00:07:34.950 START TEST accel_decomp 00:07:34.951 ************************************ 00:07:34.951 20:03:19 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:34.951 20:03:19 -- accel/accel.sh@16 -- # local accel_opc 00:07:34.951 20:03:19 -- accel/accel.sh@17 -- # local accel_module 00:07:34.951 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:34.951 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:34.951 20:03:19 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:34.951 20:03:19 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:34.951 20:03:19 -- accel/accel.sh@12 -- # build_accel_config 00:07:34.951 20:03:19 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:34.951 20:03:19 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:34.951 20:03:19 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:34.951 20:03:19 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:34.951 20:03:19 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:34.951 20:03:19 -- accel/accel.sh@40 -- # local IFS=, 00:07:34.951 20:03:19 -- accel/accel.sh@41 -- # jq -r . 00:07:34.951 [2024-04-26 20:03:19.187362] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
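The decompress-based tests that follow all operate on the pre-compressed input file at spdk/test/accel/bib, passed with -l, and add -y, which should ask accel_perf to verify the decompressed data. A sketch of the accel_decomp command just logged, under the same assumptions as above:

  # decompress the bib test file in software for 1 second and verify the result (-y)
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
      -c /dev/fd/62 -t 1 -w decompress \
      -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y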
00:07:34.951 [2024-04-26 20:03:19.187448] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618560 ] 00:07:34.951 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.951 [2024-04-26 20:03:19.274088] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.951 [2024-04-26 20:03:19.354134] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val= 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val= 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val= 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val=0x1 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val= 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val= 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val=decompress 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val= 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val=software 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@22 -- # accel_module=software 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val=32 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 
20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val=32 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val=1 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val=Yes 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val= 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:35.210 20:03:19 -- accel/accel.sh@20 -- # val= 00:07:35.210 20:03:19 -- accel/accel.sh@21 -- # case "$var" in 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # IFS=: 00:07:35.210 20:03:19 -- accel/accel.sh@19 -- # read -r var val 00:07:36.147 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.147 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.147 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.147 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.147 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.147 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.147 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.147 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.147 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.147 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.147 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.147 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.147 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.147 20:03:20 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:36.147 20:03:20 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:36.147 20:03:20 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:36.147 00:07:36.147 real 0m1.384s 00:07:36.147 user 0m1.249s 00:07:36.147 sys 0m0.149s 00:07:36.147 20:03:20 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:36.147 20:03:20 -- common/autotest_common.sh@10 -- # set +x 00:07:36.147 ************************************ 00:07:36.147 END TEST accel_decomp 00:07:36.147 ************************************ 00:07:36.147 20:03:20 -- accel/accel.sh@118 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:36.147 20:03:20 -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:36.147 20:03:20 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:36.147 20:03:20 -- common/autotest_common.sh@10 -- # set +x 00:07:36.406 ************************************ 00:07:36.406 START TEST accel_decmop_full 00:07:36.406 ************************************ 00:07:36.406 20:03:20 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:36.406 20:03:20 -- accel/accel.sh@16 -- # local accel_opc 00:07:36.406 20:03:20 -- accel/accel.sh@17 -- # local accel_module 00:07:36.406 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.406 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.406 20:03:20 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:36.406 20:03:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:36.406 20:03:20 -- accel/accel.sh@12 -- # build_accel_config 00:07:36.406 20:03:20 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:36.406 20:03:20 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:36.406 20:03:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:36.406 20:03:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:36.406 20:03:20 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:36.406 20:03:20 -- accel/accel.sh@40 -- # local IFS=, 00:07:36.406 20:03:20 -- accel/accel.sh@41 -- # jq -r . 00:07:36.406 [2024-04-26 20:03:20.755554] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
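accel_decmop_full differs from the previous case only by -o 0: the buffer size read back below is '111250 bytes' instead of '4096 bytes', so the flag evidently makes accel_perf process the whole bib file as a single operation. Sketch of the logged command:

  # full-file decompression (-o 0), seen as 111250-byte operations in the trace below
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
      -c /dev/fd/62 -t 1 -w decompress \
      -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0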
00:07:36.406 [2024-04-26 20:03:20.755655] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618772 ] 00:07:36.406 EAL: No free 2048 kB hugepages reported on node 1 00:07:36.406 [2024-04-26 20:03:20.841198] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.665 [2024-04-26 20:03:20.924862] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val=0x1 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val=decompress 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val=software 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@22 -- # accel_module=software 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val=32 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 
20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.665 20:03:20 -- accel/accel.sh@20 -- # val=32 00:07:36.665 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.665 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.666 20:03:20 -- accel/accel.sh@20 -- # val=1 00:07:36.666 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.666 20:03:20 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:36.666 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.666 20:03:20 -- accel/accel.sh@20 -- # val=Yes 00:07:36.666 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.666 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.666 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:36.666 20:03:20 -- accel/accel.sh@20 -- # val= 00:07:36.666 20:03:20 -- accel/accel.sh@21 -- # case "$var" in 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # IFS=: 00:07:36.666 20:03:20 -- accel/accel.sh@19 -- # read -r var val 00:07:38.063 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.063 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.063 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.063 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.063 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.063 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.063 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.063 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.063 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.063 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.063 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.063 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.063 20:03:22 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:38.063 20:03:22 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:38.063 20:03:22 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:38.063 00:07:38.063 real 0m1.396s 00:07:38.063 user 0m1.255s 00:07:38.063 sys 0m0.153s 00:07:38.063 20:03:22 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:38.063 20:03:22 -- common/autotest_common.sh@10 -- # set +x 00:07:38.063 ************************************ 00:07:38.063 END TEST accel_decmop_full 00:07:38.063 ************************************ 00:07:38.063 20:03:22 -- accel/accel.sh@119 -- # run_test accel_decomp_mcore 
accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:38.063 20:03:22 -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:38.063 20:03:22 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:38.063 20:03:22 -- common/autotest_common.sh@10 -- # set +x 00:07:38.063 ************************************ 00:07:38.063 START TEST accel_decomp_mcore 00:07:38.063 ************************************ 00:07:38.063 20:03:22 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:38.063 20:03:22 -- accel/accel.sh@16 -- # local accel_opc 00:07:38.063 20:03:22 -- accel/accel.sh@17 -- # local accel_module 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.063 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.063 20:03:22 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:38.063 20:03:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:38.063 20:03:22 -- accel/accel.sh@12 -- # build_accel_config 00:07:38.063 20:03:22 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:38.063 20:03:22 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:38.063 20:03:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:38.063 20:03:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:38.063 20:03:22 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:38.063 20:03:22 -- accel/accel.sh@40 -- # local IFS=, 00:07:38.063 20:03:22 -- accel/accel.sh@41 -- # jq -r . 00:07:38.063 [2024-04-26 20:03:22.326430] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
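accel_decomp_mcore adds -m 0xf, and the mask is passed straight through to DPDK (the EAL parameters below show -c 0xf): four reactors start on cores 0-3, and the reported user time (~4.6s) is roughly four times the ~1.4s wall-clock time. Equivalent sketch:

  # multi-core decompression: -m 0xf runs the workload on cores 0-3
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
      -c /dev/fd/62 -t 1 -w decompress \
      -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf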
00:07:38.063 [2024-04-26 20:03:22.326513] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1619063 ] 00:07:38.063 EAL: No free 2048 kB hugepages reported on node 1 00:07:38.063 [2024-04-26 20:03:22.411696] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:38.063 [2024-04-26 20:03:22.495809] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.063 [2024-04-26 20:03:22.495906] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:38.063 [2024-04-26 20:03:22.495942] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:38.063 [2024-04-26 20:03:22.495945] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.322 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.322 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.322 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.322 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.322 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.322 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.322 20:03:22 -- accel/accel.sh@20 -- # val=0xf 00:07:38.322 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.322 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.322 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.322 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.322 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.322 20:03:22 -- accel/accel.sh@20 -- # val=decompress 00:07:38.322 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.322 20:03:22 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.322 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.322 20:03:22 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:38.322 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val=software 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@22 -- # accel_module=software 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val=32 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val=32 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val=1 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val=Yes 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:38.323 20:03:22 -- accel/accel.sh@20 -- # val= 00:07:38.323 20:03:22 -- accel/accel.sh@21 -- # case "$var" in 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # IFS=: 00:07:38.323 20:03:22 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 
20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:23 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:39.697 20:03:23 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:39.697 20:03:23 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:39.697 00:07:39.697 real 0m1.406s 00:07:39.697 user 0m4.634s 00:07:39.697 sys 0m0.169s 00:07:39.697 20:03:23 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:39.697 20:03:23 -- common/autotest_common.sh@10 -- # set +x 00:07:39.697 ************************************ 00:07:39.697 END TEST accel_decomp_mcore 00:07:39.697 ************************************ 00:07:39.697 20:03:23 -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:39.697 20:03:23 -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:39.697 20:03:23 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:39.697 20:03:23 -- common/autotest_common.sh@10 -- # set +x 00:07:39.697 ************************************ 00:07:39.697 START TEST accel_decomp_full_mcore 00:07:39.697 ************************************ 00:07:39.697 20:03:23 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:39.697 20:03:23 -- accel/accel.sh@16 -- # local accel_opc 00:07:39.697 20:03:23 -- accel/accel.sh@17 -- # local accel_module 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:23 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:23 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:39.697 20:03:23 -- accel/accel.sh@12 -- # build_accel_config 00:07:39.697 20:03:23 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:39.697 20:03:23 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:39.697 20:03:23 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:39.697 20:03:23 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:39.697 20:03:23 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:39.697 20:03:23 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:39.697 20:03:23 -- accel/accel.sh@40 -- # local IFS=, 00:07:39.697 20:03:23 -- accel/accel.sh@41 -- # jq -r . 00:07:39.697 [2024-04-26 20:03:23.919587] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
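accel_decomp_full_mcore simply combines the two variants above: full-file operations (-o 0) across all four cores (-m 0xf).

  # full-file decompression on cores 0-3
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
      -c /dev/fd/62 -t 1 -w decompress \
      -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf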
00:07:39.697 [2024-04-26 20:03:23.919669] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1619336 ] 00:07:39.697 EAL: No free 2048 kB hugepages reported on node 1 00:07:39.697 [2024-04-26 20:03:24.002437] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:39.697 [2024-04-26 20:03:24.086630] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:39.697 [2024-04-26 20:03:24.086719] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:39.697 [2024-04-26 20:03:24.086798] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:39.697 [2024-04-26 20:03:24.086800] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.697 20:03:24 -- accel/accel.sh@20 -- # val= 00:07:39.697 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.697 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.697 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.697 20:03:24 -- accel/accel.sh@20 -- # val= 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val= 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val=0xf 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val= 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val= 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val=decompress 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val= 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val=software 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@22 -- # accel_module=software 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case 
"$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val=32 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val=32 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val=1 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val=Yes 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val= 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:39.954 20:03:24 -- accel/accel.sh@20 -- # val= 00:07:39.954 20:03:24 -- accel/accel.sh@21 -- # case "$var" in 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # IFS=: 00:07:39.954 20:03:24 -- accel/accel.sh@19 -- # read -r var val 00:07:40.887 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.887 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.887 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.887 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.887 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.887 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.887 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.887 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.887 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.887 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.887 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.887 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.887 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.887 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.887 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.887 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.887 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.887 
20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.888 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.888 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:40.888 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:40.888 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:40.888 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:40.888 20:03:25 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:40.888 20:03:25 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:40.888 20:03:25 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:40.888 00:07:40.888 real 0m1.412s 00:07:40.888 user 0m4.663s 00:07:40.888 sys 0m0.166s 00:07:40.888 20:03:25 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:40.888 20:03:25 -- common/autotest_common.sh@10 -- # set +x 00:07:40.888 ************************************ 00:07:40.888 END TEST accel_decomp_full_mcore 00:07:40.888 ************************************ 00:07:41.146 20:03:25 -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:41.146 20:03:25 -- common/autotest_common.sh@1097 -- # '[' 11 -le 1 ']' 00:07:41.146 20:03:25 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:41.146 20:03:25 -- common/autotest_common.sh@10 -- # set +x 00:07:41.146 ************************************ 00:07:41.146 START TEST accel_decomp_mthread 00:07:41.146 ************************************ 00:07:41.146 20:03:25 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:41.146 20:03:25 -- accel/accel.sh@16 -- # local accel_opc 00:07:41.146 20:03:25 -- accel/accel.sh@17 -- # local accel_module 00:07:41.146 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.146 20:03:25 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:41.146 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.146 20:03:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:41.146 20:03:25 -- accel/accel.sh@12 -- # build_accel_config 00:07:41.146 20:03:25 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:41.146 20:03:25 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:41.146 20:03:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:41.146 20:03:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:41.146 20:03:25 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:41.146 20:03:25 -- accel/accel.sh@40 -- # local IFS=, 00:07:41.146 20:03:25 -- accel/accel.sh@41 -- # jq -r . 00:07:41.146 [2024-04-26 20:03:25.533651] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
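The last case in this block, accel_decomp_mthread, stays on a single core but passes -T 2 (read back as val=2 in the trace below), presumably requesting two worker threads for the workload. Sketch:

  # single-core decompression with two worker threads (-T 2)
  /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf \
      -c /dev/fd/62 -t 1 -w decompress \
      -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2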
00:07:41.146 [2024-04-26 20:03:25.533731] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1619551 ] 00:07:41.146 EAL: No free 2048 kB hugepages reported on node 1 00:07:41.405 [2024-04-26 20:03:25.616396] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:41.405 [2024-04-26 20:03:25.694394] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val=0x1 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val=decompress 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val=software 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@22 -- # accel_module=software 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val=32 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 
20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val=32 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val=2 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.405 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.405 20:03:25 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:41.405 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.406 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.406 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.406 20:03:25 -- accel/accel.sh@20 -- # val=Yes 00:07:41.406 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.406 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.406 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.406 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:41.406 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.406 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.406 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:41.406 20:03:25 -- accel/accel.sh@20 -- # val= 00:07:41.406 20:03:25 -- accel/accel.sh@21 -- # case "$var" in 00:07:41.406 20:03:25 -- accel/accel.sh@19 -- # IFS=: 00:07:41.406 20:03:25 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:26 -- accel/accel.sh@20 -- # val= 00:07:42.784 20:03:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # IFS=: 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:26 -- accel/accel.sh@20 -- # val= 00:07:42.784 20:03:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # IFS=: 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:26 -- accel/accel.sh@20 -- # val= 00:07:42.784 20:03:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # IFS=: 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:26 -- accel/accel.sh@20 -- # val= 00:07:42.784 20:03:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # IFS=: 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:26 -- accel/accel.sh@20 -- # val= 00:07:42.784 20:03:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # IFS=: 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:26 -- accel/accel.sh@20 -- # val= 00:07:42.784 20:03:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # IFS=: 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:26 -- accel/accel.sh@20 -- # val= 00:07:42.784 20:03:26 -- accel/accel.sh@21 -- # case "$var" in 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # IFS=: 00:07:42.784 20:03:26 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:26 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:42.784 20:03:26 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:42.784 20:03:26 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:42.784 00:07:42.784 real 0m1.369s 00:07:42.784 user 0m1.233s 00:07:42.784 sys 0m0.148s 00:07:42.784 20:03:26 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:42.784 20:03:26 -- common/autotest_common.sh@10 -- # 
set +x 00:07:42.784 ************************************ 00:07:42.784 END TEST accel_decomp_mthread 00:07:42.784 ************************************ 00:07:42.784 20:03:26 -- accel/accel.sh@122 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:42.784 20:03:26 -- common/autotest_common.sh@1097 -- # '[' 13 -le 1 ']' 00:07:42.784 20:03:26 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:42.784 20:03:26 -- common/autotest_common.sh@10 -- # set +x 00:07:42.784 ************************************ 00:07:42.784 START TEST accel_deomp_full_mthread 00:07:42.784 ************************************ 00:07:42.784 20:03:27 -- common/autotest_common.sh@1121 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:42.784 20:03:27 -- accel/accel.sh@16 -- # local accel_opc 00:07:42.784 20:03:27 -- accel/accel.sh@17 -- # local accel_module 00:07:42.784 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:42.784 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:42.784 20:03:27 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:42.784 20:03:27 -- accel/accel.sh@12 -- # build_accel_config 00:07:42.784 20:03:27 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:42.784 20:03:27 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:42.784 20:03:27 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:42.784 20:03:27 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:42.784 20:03:27 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:42.784 20:03:27 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:42.784 20:03:27 -- accel/accel.sh@40 -- # local IFS=, 00:07:42.784 20:03:27 -- accel/accel.sh@41 -- # jq -r . 00:07:42.784 [2024-04-26 20:03:27.104092] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:07:42.784 [2024-04-26 20:03:27.104167] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1619749 ] 00:07:42.784 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.784 [2024-04-26 20:03:27.188142] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.043 [2024-04-26 20:03:27.271688] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val= 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val= 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val= 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val=0x1 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val= 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val= 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val=decompress 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val= 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val=software 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@22 -- # accel_module=software 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val=32 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 
20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val=32 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val=2 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val='1 seconds' 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val=Yes 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val= 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:43.043 20:03:27 -- accel/accel.sh@20 -- # val= 00:07:43.043 20:03:27 -- accel/accel.sh@21 -- # case "$var" in 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # IFS=: 00:07:43.043 20:03:27 -- accel/accel.sh@19 -- # read -r var val 00:07:44.421 20:03:28 -- accel/accel.sh@20 -- # val= 00:07:44.421 20:03:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # IFS=: 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # read -r var val 00:07:44.421 20:03:28 -- accel/accel.sh@20 -- # val= 00:07:44.421 20:03:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # IFS=: 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # read -r var val 00:07:44.421 20:03:28 -- accel/accel.sh@20 -- # val= 00:07:44.421 20:03:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # IFS=: 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # read -r var val 00:07:44.421 20:03:28 -- accel/accel.sh@20 -- # val= 00:07:44.421 20:03:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # IFS=: 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # read -r var val 00:07:44.421 20:03:28 -- accel/accel.sh@20 -- # val= 00:07:44.421 20:03:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # IFS=: 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # read -r var val 00:07:44.421 20:03:28 -- accel/accel.sh@20 -- # val= 00:07:44.421 20:03:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # IFS=: 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # read -r var val 00:07:44.421 20:03:28 -- accel/accel.sh@20 -- # val= 00:07:44.421 20:03:28 -- accel/accel.sh@21 -- # case "$var" in 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # IFS=: 00:07:44.421 20:03:28 -- accel/accel.sh@19 -- # read -r var val 00:07:44.421 20:03:28 -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:44.421 20:03:28 -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:44.421 20:03:28 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:44.421 00:07:44.421 real 0m1.412s 00:07:44.421 user 0m1.269s 00:07:44.421 sys 0m0.154s 00:07:44.421 20:03:28 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:44.421 20:03:28 -- common/autotest_common.sh@10 -- # 
set +x 00:07:44.421 ************************************ 00:07:44.421 END TEST accel_deomp_full_mthread 00:07:44.421 ************************************ 00:07:44.421 20:03:28 -- accel/accel.sh@124 -- # [[ n == y ]] 00:07:44.421 20:03:28 -- accel/accel.sh@137 -- # build_accel_config 00:07:44.421 20:03:28 -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:44.421 20:03:28 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:44.421 20:03:28 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:44.421 20:03:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:44.421 20:03:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:44.421 20:03:28 -- common/autotest_common.sh@1097 -- # '[' 4 -le 1 ']' 00:07:44.421 20:03:28 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:44.421 20:03:28 -- accel/accel.sh@40 -- # local IFS=, 00:07:44.421 20:03:28 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:44.421 20:03:28 -- accel/accel.sh@41 -- # jq -r . 00:07:44.421 20:03:28 -- common/autotest_common.sh@10 -- # set +x 00:07:44.421 ************************************ 00:07:44.421 START TEST accel_dif_functional_tests 00:07:44.421 ************************************ 00:07:44.421 20:03:28 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:44.421 [2024-04-26 20:03:28.696333] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:44.422 [2024-04-26 20:03:28.696412] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1619959 ] 00:07:44.422 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.422 [2024-04-26 20:03:28.779560] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:44.422 [2024-04-26 20:03:28.861833] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:44.422 [2024-04-26 20:03:28.861923] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:44.422 [2024-04-26 20:03:28.861925] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.681 00:07:44.681 00:07:44.681 CUnit - A unit testing framework for C - Version 2.1-3 00:07:44.681 http://cunit.sourceforge.net/ 00:07:44.681 00:07:44.681 00:07:44.681 Suite: accel_dif 00:07:44.681 Test: verify: DIF generated, GUARD check ...passed 00:07:44.681 Test: verify: DIF generated, APPTAG check ...passed 00:07:44.681 Test: verify: DIF generated, REFTAG check ...passed 00:07:44.681 Test: verify: DIF not generated, GUARD check ...[2024-04-26 20:03:28.940819] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:44.681 [2024-04-26 20:03:28.940870] dif.c: 828:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:44.681 passed 00:07:44.681 Test: verify: DIF not generated, APPTAG check ...[2024-04-26 20:03:28.940909] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:44.681 [2024-04-26 20:03:28.940929] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:44.681 passed 00:07:44.681 Test: verify: DIF not generated, REFTAG check ...[2024-04-26 20:03:28.940950] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:44.681 [2024-04-26 
20:03:28.940970] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:44.681 passed 00:07:44.681 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:44.681 Test: verify: APPTAG incorrect, APPTAG check ...[2024-04-26 20:03:28.941015] dif.c: 843:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:44.681 passed 00:07:44.681 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:44.681 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:44.681 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:44.681 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-04-26 20:03:28.941119] dif.c: 778:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:44.681 passed 00:07:44.681 Test: generate copy: DIF generated, GUARD check ...passed 00:07:44.681 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:44.681 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:44.681 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:44.681 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:44.681 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:44.681 Test: generate copy: iovecs-len validate ...[2024-04-26 20:03:28.941295] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 00:07:44.681 passed 00:07:44.681 Test: generate copy: buffer alignment validate ...passed 00:07:44.681 00:07:44.681 Run Summary: Type Total Ran Passed Failed Inactive 00:07:44.681 suites 1 1 n/a 0 0 00:07:44.681 tests 20 20 20 0 0 00:07:44.681 asserts 204 204 204 0 n/a 00:07:44.681 00:07:44.681 Elapsed time = 0.002 seconds 00:07:44.941 00:07:44.941 real 0m0.454s 00:07:44.941 user 0m0.635s 00:07:44.941 sys 0m0.183s 00:07:44.941 20:03:29 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:44.941 20:03:29 -- common/autotest_common.sh@10 -- # set +x 00:07:44.941 ************************************ 00:07:44.941 END TEST accel_dif_functional_tests 00:07:44.941 ************************************ 00:07:44.941 00:07:44.941 real 0m35.642s 00:07:44.941 user 0m36.278s 00:07:44.941 sys 0m6.991s 00:07:44.941 20:03:29 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:44.941 20:03:29 -- common/autotest_common.sh@10 -- # set +x 00:07:44.941 ************************************ 00:07:44.941 END TEST accel 00:07:44.941 ************************************ 00:07:44.941 20:03:29 -- spdk/autotest.sh@180 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:44.941 20:03:29 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:44.941 20:03:29 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:44.941 20:03:29 -- common/autotest_common.sh@10 -- # set +x 00:07:44.941 ************************************ 00:07:44.941 START TEST accel_rpc 00:07:44.941 ************************************ 00:07:44.941 20:03:29 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:45.200 * Looking for test storage... 
00:07:45.200 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:45.200 20:03:29 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:45.200 20:03:29 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1620190 00:07:45.200 20:03:29 -- accel/accel_rpc.sh@15 -- # waitforlisten 1620190 00:07:45.200 20:03:29 -- common/autotest_common.sh@827 -- # '[' -z 1620190 ']' 00:07:45.200 20:03:29 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:45.200 20:03:29 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:45.200 20:03:29 -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:45.200 20:03:29 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:45.200 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:45.200 20:03:29 -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:45.200 20:03:29 -- common/autotest_common.sh@10 -- # set +x 00:07:45.200 [2024-04-26 20:03:29.506309] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:45.200 [2024-04-26 20:03:29.506361] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1620190 ] 00:07:45.200 EAL: No free 2048 kB hugepages reported on node 1 00:07:45.200 [2024-04-26 20:03:29.587890] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.459 [2024-04-26 20:03:29.675513] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.027 20:03:30 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:46.027 20:03:30 -- common/autotest_common.sh@860 -- # return 0 00:07:46.027 20:03:30 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:46.027 20:03:30 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:46.027 20:03:30 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:46.027 20:03:30 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:46.027 20:03:30 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:46.027 20:03:30 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:46.027 20:03:30 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:46.027 20:03:30 -- common/autotest_common.sh@10 -- # set +x 00:07:46.286 ************************************ 00:07:46.286 START TEST accel_assign_opcode 00:07:46.286 ************************************ 00:07:46.286 20:03:30 -- common/autotest_common.sh@1121 -- # accel_assign_opcode_test_suite 00:07:46.286 20:03:30 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:46.286 20:03:30 -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.286 20:03:30 -- common/autotest_common.sh@10 -- # set +x 00:07:46.286 [2024-04-26 20:03:30.498078] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:46.286 20:03:30 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:46.286 20:03:30 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:46.286 20:03:30 -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.286 20:03:30 -- common/autotest_common.sh@10 -- # set +x 00:07:46.286 [2024-04-26 20:03:30.506071] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation copy will be assigned to module software 00:07:46.286 20:03:30 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:46.286 20:03:30 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:46.286 20:03:30 -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.286 20:03:30 -- common/autotest_common.sh@10 -- # set +x 00:07:46.286 20:03:30 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:46.286 20:03:30 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:46.286 20:03:30 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:46.286 20:03:30 -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:46.286 20:03:30 -- accel/accel_rpc.sh@42 -- # grep software 00:07:46.286 20:03:30 -- common/autotest_common.sh@10 -- # set +x 00:07:46.286 20:03:30 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:46.545 software 00:07:46.545 00:07:46.545 real 0m0.255s 00:07:46.545 user 0m0.040s 00:07:46.545 sys 0m0.015s 00:07:46.545 20:03:30 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:46.545 20:03:30 -- common/autotest_common.sh@10 -- # set +x 00:07:46.545 ************************************ 00:07:46.545 END TEST accel_assign_opcode 00:07:46.545 ************************************ 00:07:46.545 20:03:30 -- accel/accel_rpc.sh@55 -- # killprocess 1620190 00:07:46.545 20:03:30 -- common/autotest_common.sh@946 -- # '[' -z 1620190 ']' 00:07:46.545 20:03:30 -- common/autotest_common.sh@950 -- # kill -0 1620190 00:07:46.545 20:03:30 -- common/autotest_common.sh@951 -- # uname 00:07:46.545 20:03:30 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:46.545 20:03:30 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1620190 00:07:46.545 20:03:30 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:46.545 20:03:30 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:46.545 20:03:30 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1620190' 00:07:46.545 killing process with pid 1620190 00:07:46.545 20:03:30 -- common/autotest_common.sh@965 -- # kill 1620190 00:07:46.545 20:03:30 -- common/autotest_common.sh@970 -- # wait 1620190 00:07:46.804 00:07:46.804 real 0m1.797s 00:07:46.804 user 0m1.847s 00:07:46.804 sys 0m0.562s 00:07:46.804 20:03:31 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:46.804 20:03:31 -- common/autotest_common.sh@10 -- # set +x 00:07:46.804 ************************************ 00:07:46.804 END TEST accel_rpc 00:07:46.804 ************************************ 00:07:46.804 20:03:31 -- spdk/autotest.sh@181 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:46.804 20:03:31 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:46.804 20:03:31 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:46.804 20:03:31 -- common/autotest_common.sh@10 -- # set +x 00:07:47.063 ************************************ 00:07:47.063 START TEST app_cmdline 00:07:47.063 ************************************ 00:07:47.063 20:03:31 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:47.063 * Looking for test storage... 
00:07:47.321 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:47.321 20:03:31 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:47.321 20:03:31 -- app/cmdline.sh@17 -- # spdk_tgt_pid=1620461 00:07:47.321 20:03:31 -- app/cmdline.sh@18 -- # waitforlisten 1620461 00:07:47.321 20:03:31 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:47.321 20:03:31 -- common/autotest_common.sh@827 -- # '[' -z 1620461 ']' 00:07:47.321 20:03:31 -- common/autotest_common.sh@831 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:47.321 20:03:31 -- common/autotest_common.sh@832 -- # local max_retries=100 00:07:47.321 20:03:31 -- common/autotest_common.sh@834 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:47.321 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:47.321 20:03:31 -- common/autotest_common.sh@836 -- # xtrace_disable 00:07:47.321 20:03:31 -- common/autotest_common.sh@10 -- # set +x 00:07:47.321 [2024-04-26 20:03:31.526629] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:47.321 [2024-04-26 20:03:31.526680] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1620461 ] 00:07:47.321 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.321 [2024-04-26 20:03:31.609494] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.321 [2024-04-26 20:03:31.697894] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:48.323 20:03:32 -- common/autotest_common.sh@856 -- # (( i == 0 )) 00:07:48.323 20:03:32 -- common/autotest_common.sh@860 -- # return 0 00:07:48.323 20:03:32 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:48.323 { 00:07:48.323 "version": "SPDK v24.05-pre git sha1 13a9f2aa2", 00:07:48.323 "fields": { 00:07:48.323 "major": 24, 00:07:48.323 "minor": 5, 00:07:48.323 "patch": 0, 00:07:48.323 "suffix": "-pre", 00:07:48.323 "commit": "13a9f2aa2" 00:07:48.323 } 00:07:48.323 } 00:07:48.323 20:03:32 -- app/cmdline.sh@22 -- # expected_methods=() 00:07:48.323 20:03:32 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:48.323 20:03:32 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:48.323 20:03:32 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:48.323 20:03:32 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:48.323 20:03:32 -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:48.323 20:03:32 -- app/cmdline.sh@26 -- # sort 00:07:48.323 20:03:32 -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:48.323 20:03:32 -- common/autotest_common.sh@10 -- # set +x 00:07:48.323 20:03:32 -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:48.323 20:03:32 -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:07:48.323 20:03:32 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:48.323 20:03:32 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:48.323 20:03:32 -- common/autotest_common.sh@648 -- # local es=0 00:07:48.323 20:03:32 -- 
common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:48.323 20:03:32 -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:48.323 20:03:32 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:48.323 20:03:32 -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:48.323 20:03:32 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:48.323 20:03:32 -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:48.323 20:03:32 -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:48.323 20:03:32 -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py 00:07:48.323 20:03:32 -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]] 00:07:48.323 20:03:32 -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:48.323 request: 00:07:48.323 { 00:07:48.323 "method": "env_dpdk_get_mem_stats", 00:07:48.323 "req_id": 1 00:07:48.323 } 00:07:48.323 Got JSON-RPC error response 00:07:48.323 response: 00:07:48.323 { 00:07:48.323 "code": -32601, 00:07:48.323 "message": "Method not found" 00:07:48.323 } 00:07:48.323 20:03:32 -- common/autotest_common.sh@651 -- # es=1 00:07:48.323 20:03:32 -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:48.323 20:03:32 -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:48.323 20:03:32 -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:48.323 20:03:32 -- app/cmdline.sh@1 -- # killprocess 1620461 00:07:48.323 20:03:32 -- common/autotest_common.sh@946 -- # '[' -z 1620461 ']' 00:07:48.323 20:03:32 -- common/autotest_common.sh@950 -- # kill -0 1620461 00:07:48.610 20:03:32 -- common/autotest_common.sh@951 -- # uname 00:07:48.610 20:03:32 -- common/autotest_common.sh@951 -- # '[' Linux = Linux ']' 00:07:48.610 20:03:32 -- common/autotest_common.sh@952 -- # ps --no-headers -o comm= 1620461 00:07:48.610 20:03:32 -- common/autotest_common.sh@952 -- # process_name=reactor_0 00:07:48.610 20:03:32 -- common/autotest_common.sh@956 -- # '[' reactor_0 = sudo ']' 00:07:48.610 20:03:32 -- common/autotest_common.sh@964 -- # echo 'killing process with pid 1620461' 00:07:48.610 killing process with pid 1620461 00:07:48.610 20:03:32 -- common/autotest_common.sh@965 -- # kill 1620461 00:07:48.610 20:03:32 -- common/autotest_common.sh@970 -- # wait 1620461 00:07:48.869 00:07:48.869 real 0m1.732s 00:07:48.869 user 0m1.976s 00:07:48.869 sys 0m0.515s 00:07:48.869 20:03:33 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:48.869 20:03:33 -- common/autotest_common.sh@10 -- # set +x 00:07:48.869 ************************************ 00:07:48.869 END TEST app_cmdline 00:07:48.869 ************************************ 00:07:48.869 20:03:33 -- spdk/autotest.sh@182 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:48.869 20:03:33 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:48.869 20:03:33 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:48.869 20:03:33 -- common/autotest_common.sh@10 -- # set +x 00:07:49.127 ************************************ 00:07:49.127 START TEST version 00:07:49.127 
************************************ 00:07:49.127 20:03:33 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh 00:07:49.127 * Looking for test storage... 00:07:49.127 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:49.127 20:03:33 -- app/version.sh@17 -- # get_header_version major 00:07:49.127 20:03:33 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:49.127 20:03:33 -- app/version.sh@14 -- # cut -f2 00:07:49.127 20:03:33 -- app/version.sh@14 -- # tr -d '"' 00:07:49.127 20:03:33 -- app/version.sh@17 -- # major=24 00:07:49.127 20:03:33 -- app/version.sh@18 -- # get_header_version minor 00:07:49.127 20:03:33 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:49.127 20:03:33 -- app/version.sh@14 -- # cut -f2 00:07:49.127 20:03:33 -- app/version.sh@14 -- # tr -d '"' 00:07:49.127 20:03:33 -- app/version.sh@18 -- # minor=5 00:07:49.127 20:03:33 -- app/version.sh@19 -- # get_header_version patch 00:07:49.127 20:03:33 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:49.127 20:03:33 -- app/version.sh@14 -- # cut -f2 00:07:49.127 20:03:33 -- app/version.sh@14 -- # tr -d '"' 00:07:49.127 20:03:33 -- app/version.sh@19 -- # patch=0 00:07:49.127 20:03:33 -- app/version.sh@20 -- # get_header_version suffix 00:07:49.127 20:03:33 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:49.127 20:03:33 -- app/version.sh@14 -- # cut -f2 00:07:49.127 20:03:33 -- app/version.sh@14 -- # tr -d '"' 00:07:49.127 20:03:33 -- app/version.sh@20 -- # suffix=-pre 00:07:49.127 20:03:33 -- app/version.sh@22 -- # version=24.5 00:07:49.127 20:03:33 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:49.127 20:03:33 -- app/version.sh@28 -- # version=24.5rc0 00:07:49.127 20:03:33 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:49.127 20:03:33 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:49.127 20:03:33 -- app/version.sh@30 -- # py_version=24.5rc0 00:07:49.127 20:03:33 -- app/version.sh@31 -- # [[ 24.5rc0 == \2\4\.\5\r\c\0 ]] 00:07:49.127 00:07:49.127 real 0m0.178s 00:07:49.127 user 0m0.092s 00:07:49.127 sys 0m0.132s 00:07:49.127 20:03:33 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:07:49.127 20:03:33 -- common/autotest_common.sh@10 -- # set +x 00:07:49.127 ************************************ 00:07:49.127 END TEST version 00:07:49.127 ************************************ 00:07:49.127 20:03:33 -- spdk/autotest.sh@184 -- # '[' 0 -eq 1 ']' 00:07:49.127 20:03:33 -- spdk/autotest.sh@194 -- # uname -s 00:07:49.127 20:03:33 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:07:49.127 20:03:33 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:49.127 20:03:33 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:07:49.127 20:03:33 -- spdk/autotest.sh@207 -- # '[' 0 -eq 1 ']' 
00:07:49.127 20:03:33 -- spdk/autotest.sh@254 -- # '[' 0 -eq 1 ']' 00:07:49.127 20:03:33 -- spdk/autotest.sh@258 -- # timing_exit lib 00:07:49.127 20:03:33 -- common/autotest_common.sh@726 -- # xtrace_disable 00:07:49.127 20:03:33 -- common/autotest_common.sh@10 -- # set +x 00:07:49.385 20:03:33 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@268 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@277 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@310 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@314 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@328 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@341 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@345 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@354 -- # '[' 0 -eq 1 ']' 00:07:49.385 20:03:33 -- spdk/autotest.sh@361 -- # [[ 0 -eq 1 ]] 00:07:49.385 20:03:33 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]] 00:07:49.385 20:03:33 -- spdk/autotest.sh@369 -- # [[ 1 -eq 1 ]] 00:07:49.385 20:03:33 -- spdk/autotest.sh@370 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:49.385 20:03:33 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:49.385 20:03:33 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:49.385 20:03:33 -- common/autotest_common.sh@10 -- # set +x 00:07:49.385 ************************************ 00:07:49.385 START TEST llvm_fuzz 00:07:49.385 ************************************ 00:07:49.385 20:03:33 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:49.656 * Looking for test storage... 
00:07:49.656 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:49.656 20:03:33 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:49.656 20:03:33 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:49.656 20:03:33 -- common/autotest_common.sh@546 -- # fuzzers=() 00:07:49.656 20:03:33 -- common/autotest_common.sh@546 -- # local fuzzers 00:07:49.656 20:03:33 -- common/autotest_common.sh@548 -- # [[ -n '' ]] 00:07:49.656 20:03:33 -- common/autotest_common.sh@551 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:49.656 20:03:33 -- common/autotest_common.sh@552 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:49.656 20:03:33 -- common/autotest_common.sh@555 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:49.656 20:03:33 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:49.656 20:03:33 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:49.656 20:03:33 -- fuzz/llvm.sh@56 -- # [[ 1 -eq 0 ]] 00:07:49.656 20:03:33 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:49.656 20:03:33 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:49.656 20:03:33 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:49.656 20:03:33 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:49.656 20:03:33 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:07:49.656 20:03:33 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:07:49.656 20:03:33 -- fuzz/llvm.sh@62 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:49.656 20:03:33 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:07:49.656 20:03:33 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:07:49.656 20:03:33 -- common/autotest_common.sh@10 -- # set +x 00:07:49.656 ************************************ 00:07:49.656 START TEST nvmf_fuzz 00:07:49.656 ************************************ 00:07:49.656 20:03:34 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:49.917 * Looking for test storage... 
00:07:49.917 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.917 20:03:34 -- nvmf/run.sh@60 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:49.917 20:03:34 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:49.917 20:03:34 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:49.917 20:03:34 -- common/autotest_common.sh@34 -- # set -e 00:07:49.917 20:03:34 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:49.917 20:03:34 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:49.917 20:03:34 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:07:49.917 20:03:34 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:49.917 20:03:34 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:49.917 20:03:34 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:49.917 20:03:34 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:49.917 20:03:34 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:49.917 20:03:34 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:49.917 20:03:34 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:49.917 20:03:34 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:49.917 20:03:34 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:49.917 20:03:34 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:49.917 20:03:34 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:49.917 20:03:34 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:49.917 20:03:34 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:49.917 20:03:34 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:49.917 20:03:34 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:49.917 20:03:34 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:49.917 20:03:34 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:49.917 20:03:34 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:49.917 20:03:34 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:49.917 20:03:34 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:49.917 20:03:34 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:49.917 20:03:34 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:49.917 20:03:34 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:49.917 20:03:34 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:49.917 20:03:34 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:49.917 20:03:34 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:49.918 20:03:34 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:49.918 20:03:34 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:49.918 20:03:34 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:49.918 20:03:34 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:49.918 20:03:34 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:49.918 20:03:34 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:49.918 20:03:34 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:49.918 20:03:34 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 
00:07:49.918 20:03:34 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:49.918 20:03:34 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:49.918 20:03:34 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:49.918 20:03:34 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:49.918 20:03:34 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:49.918 20:03:34 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:49.918 20:03:34 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:49.918 20:03:34 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:49.918 20:03:34 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:49.918 20:03:34 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:49.918 20:03:34 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:49.918 20:03:34 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:49.918 20:03:34 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:49.918 20:03:34 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:49.918 20:03:34 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:49.918 20:03:34 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:49.918 20:03:34 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:49.918 20:03:34 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:49.918 20:03:34 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:49.918 20:03:34 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:49.918 20:03:34 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:07:49.918 20:03:34 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:07:49.918 20:03:34 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:07:49.918 20:03:34 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:07:49.918 20:03:34 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:07:49.918 20:03:34 -- common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:07:49.918 20:03:34 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:07:49.918 20:03:34 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:07:49.918 20:03:34 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:07:49.918 20:03:34 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR= 00:07:49.918 20:03:34 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:07:49.918 20:03:34 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:07:49.918 20:03:34 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:07:49.918 20:03:34 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:07:49.918 20:03:34 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:07:49.918 20:03:34 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:49.918 20:03:34 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:07:49.918 20:03:34 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:07:49.918 20:03:34 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:07:49.918 20:03:34 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:07:49.918 20:03:34 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:07:49.918 20:03:34 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:07:49.918 20:03:34 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:07:49.918 20:03:34 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:07:49.918 20:03:34 -- common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:07:49.918 20:03:34 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 
00:07:49.918 20:03:34 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:07:49.918 20:03:34 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:07:49.918 20:03:34 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:07:49.918 20:03:34 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:07:49.918 20:03:34 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:49.918 20:03:34 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:07:49.918 20:03:34 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:49.918 20:03:34 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:07:49.918 20:03:34 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:49.918 20:03:34 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:49.918 20:03:34 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:49.918 20:03:34 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:49.918 20:03:34 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:07:49.918 20:03:34 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:07:49.918 20:03:34 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:07:49.918 20:03:34 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:07:49.918 20:03:34 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:07:49.918 20:03:34 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:07:49.918 20:03:34 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:07:49.918 20:03:34 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:07:49.918 #define SPDK_CONFIG_H 00:07:49.918 #define SPDK_CONFIG_APPS 1 00:07:49.918 #define SPDK_CONFIG_ARCH native 00:07:49.918 #undef SPDK_CONFIG_ASAN 00:07:49.918 #undef SPDK_CONFIG_AVAHI 00:07:49.918 #undef SPDK_CONFIG_CET 00:07:49.918 #define SPDK_CONFIG_COVERAGE 1 00:07:49.918 #define SPDK_CONFIG_CROSS_PREFIX 00:07:49.918 #undef SPDK_CONFIG_CRYPTO 00:07:49.918 #undef SPDK_CONFIG_CRYPTO_MLX5 00:07:49.918 #undef SPDK_CONFIG_CUSTOMOCF 00:07:49.918 #undef SPDK_CONFIG_DAOS 00:07:49.918 #define SPDK_CONFIG_DAOS_DIR 00:07:49.918 #define SPDK_CONFIG_DEBUG 1 00:07:49.918 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:07:49.918 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:49.918 #define SPDK_CONFIG_DPDK_INC_DIR 00:07:49.918 #define SPDK_CONFIG_DPDK_LIB_DIR 00:07:49.918 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:07:49.918 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:49.918 #define SPDK_CONFIG_EXAMPLES 1 00:07:49.918 #undef SPDK_CONFIG_FC 00:07:49.918 #define SPDK_CONFIG_FC_PATH 00:07:49.918 #define SPDK_CONFIG_FIO_PLUGIN 1 00:07:49.918 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:07:49.918 #undef SPDK_CONFIG_FUSE 00:07:49.918 #define SPDK_CONFIG_FUZZER 1 00:07:49.918 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:07:49.918 #undef SPDK_CONFIG_GOLANG 00:07:49.918 
#define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:07:49.918 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:07:49.918 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:07:49.918 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:07:49.918 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:07:49.918 #undef SPDK_CONFIG_HAVE_LIBBSD 00:07:49.918 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:07:49.918 #define SPDK_CONFIG_IDXD 1 00:07:49.918 #undef SPDK_CONFIG_IDXD_KERNEL 00:07:49.918 #undef SPDK_CONFIG_IPSEC_MB 00:07:49.918 #define SPDK_CONFIG_IPSEC_MB_DIR 00:07:49.918 #define SPDK_CONFIG_ISAL 1 00:07:49.918 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:07:49.918 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:07:49.918 #define SPDK_CONFIG_LIBDIR 00:07:49.918 #undef SPDK_CONFIG_LTO 00:07:49.918 #define SPDK_CONFIG_MAX_LCORES 00:07:49.918 #define SPDK_CONFIG_NVME_CUSE 1 00:07:49.918 #undef SPDK_CONFIG_OCF 00:07:49.918 #define SPDK_CONFIG_OCF_PATH 00:07:49.918 #define SPDK_CONFIG_OPENSSL_PATH 00:07:49.918 #undef SPDK_CONFIG_PGO_CAPTURE 00:07:49.918 #define SPDK_CONFIG_PGO_DIR 00:07:49.918 #undef SPDK_CONFIG_PGO_USE 00:07:49.918 #define SPDK_CONFIG_PREFIX /usr/local 00:07:49.918 #undef SPDK_CONFIG_RAID5F 00:07:49.918 #undef SPDK_CONFIG_RBD 00:07:49.918 #define SPDK_CONFIG_RDMA 1 00:07:49.918 #define SPDK_CONFIG_RDMA_PROV verbs 00:07:49.918 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:07:49.918 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:07:49.918 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:07:49.918 #undef SPDK_CONFIG_SHARED 00:07:49.918 #undef SPDK_CONFIG_SMA 00:07:49.918 #define SPDK_CONFIG_TESTS 1 00:07:49.918 #undef SPDK_CONFIG_TSAN 00:07:49.918 #define SPDK_CONFIG_UBLK 1 00:07:49.918 #define SPDK_CONFIG_UBSAN 1 00:07:49.918 #undef SPDK_CONFIG_UNIT_TESTS 00:07:49.918 #undef SPDK_CONFIG_URING 00:07:49.918 #define SPDK_CONFIG_URING_PATH 00:07:49.918 #undef SPDK_CONFIG_URING_ZNS 00:07:49.918 #undef SPDK_CONFIG_USDT 00:07:49.918 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:07:49.918 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:07:49.918 #define SPDK_CONFIG_VFIO_USER 1 00:07:49.918 #define SPDK_CONFIG_VFIO_USER_DIR 00:07:49.918 #define SPDK_CONFIG_VHOST 1 00:07:49.918 #define SPDK_CONFIG_VIRTIO 1 00:07:49.918 #undef SPDK_CONFIG_VTUNE 00:07:49.918 #define SPDK_CONFIG_VTUNE_DIR 00:07:49.918 #define SPDK_CONFIG_WERROR 1 00:07:49.918 #define SPDK_CONFIG_WPDK_DIR 00:07:49.918 #undef SPDK_CONFIG_XNVME 00:07:49.918 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:07:49.918 20:03:34 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:07:49.918 20:03:34 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:07:49.918 20:03:34 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:49.918 20:03:34 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:49.918 20:03:34 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:49.918 20:03:34 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:49.919 20:03:34 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:49.919 20:03:34 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:49.919 20:03:34 -- paths/export.sh@5 -- # export PATH 00:07:49.919 20:03:34 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:49.919 20:03:34 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:49.919 20:03:34 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:07:49.919 20:03:34 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:49.919 20:03:34 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:07:49.919 20:03:34 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:07:49.919 20:03:34 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:07:49.919 20:03:34 -- pm/common@67 -- # TEST_TAG=N/A 00:07:49.919 20:03:34 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:07:49.919 20:03:34 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:49.919 20:03:34 -- pm/common@71 -- # uname -s 00:07:49.919 20:03:34 -- pm/common@71 -- # PM_OS=Linux 00:07:49.919 20:03:34 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:07:49.919 20:03:34 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:07:49.919 20:03:34 -- pm/common@76 -- # [[ Linux == Linux ]] 00:07:49.919 20:03:34 -- pm/common@76 -- # [[ ............................... != QEMU ]] 00:07:49.919 20:03:34 -- pm/common@76 -- # [[ ! 
-e /.dockerenv ]] 00:07:49.919 20:03:34 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:07:49.919 20:03:34 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:07:49.919 20:03:34 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:07:49.919 20:03:34 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:07:49.919 20:03:34 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:07:49.919 20:03:34 -- common/autotest_common.sh@57 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:07:49.919 20:03:34 -- common/autotest_common.sh@61 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:07:49.919 20:03:34 -- common/autotest_common.sh@63 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:07:49.919 20:03:34 -- common/autotest_common.sh@65 -- # : 1 00:07:49.919 20:03:34 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:07:49.919 20:03:34 -- common/autotest_common.sh@67 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:07:49.919 20:03:34 -- common/autotest_common.sh@69 -- # : 00:07:49.919 20:03:34 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:07:49.919 20:03:34 -- common/autotest_common.sh@71 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:07:49.919 20:03:34 -- common/autotest_common.sh@73 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:07:49.919 20:03:34 -- common/autotest_common.sh@75 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:07:49.919 20:03:34 -- common/autotest_common.sh@77 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:07:49.919 20:03:34 -- common/autotest_common.sh@79 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:07:49.919 20:03:34 -- common/autotest_common.sh@81 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:07:49.919 20:03:34 -- common/autotest_common.sh@83 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:07:49.919 20:03:34 -- common/autotest_common.sh@85 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:07:49.919 20:03:34 -- common/autotest_common.sh@87 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:07:49.919 20:03:34 -- common/autotest_common.sh@89 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:07:49.919 20:03:34 -- common/autotest_common.sh@91 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:07:49.919 20:03:34 -- common/autotest_common.sh@93 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:07:49.919 20:03:34 -- common/autotest_common.sh@95 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:07:49.919 20:03:34 -- common/autotest_common.sh@97 -- # : 1 00:07:49.919 20:03:34 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:07:49.919 20:03:34 -- common/autotest_common.sh@99 -- # : 1 00:07:49.919 20:03:34 -- common/autotest_common.sh@100 -- # export 
SPDK_TEST_FUZZER_SHORT 00:07:49.919 20:03:34 -- common/autotest_common.sh@101 -- # : rdma 00:07:49.919 20:03:34 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:07:49.919 20:03:34 -- common/autotest_common.sh@103 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:07:49.919 20:03:34 -- common/autotest_common.sh@105 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:07:49.919 20:03:34 -- common/autotest_common.sh@107 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:07:49.919 20:03:34 -- common/autotest_common.sh@109 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:07:49.919 20:03:34 -- common/autotest_common.sh@111 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:07:49.919 20:03:34 -- common/autotest_common.sh@113 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:07:49.919 20:03:34 -- common/autotest_common.sh@115 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:07:49.919 20:03:34 -- common/autotest_common.sh@117 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:07:49.919 20:03:34 -- common/autotest_common.sh@119 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:07:49.919 20:03:34 -- common/autotest_common.sh@121 -- # : 1 00:07:49.919 20:03:34 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:07:49.919 20:03:34 -- common/autotest_common.sh@123 -- # : 00:07:49.919 20:03:34 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:07:49.919 20:03:34 -- common/autotest_common.sh@125 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:07:49.919 20:03:34 -- common/autotest_common.sh@127 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:07:49.919 20:03:34 -- common/autotest_common.sh@129 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:07:49.919 20:03:34 -- common/autotest_common.sh@131 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:07:49.919 20:03:34 -- common/autotest_common.sh@133 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:07:49.919 20:03:34 -- common/autotest_common.sh@135 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:07:49.919 20:03:34 -- common/autotest_common.sh@137 -- # : 00:07:49.919 20:03:34 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:07:49.919 20:03:34 -- common/autotest_common.sh@139 -- # : true 00:07:49.919 20:03:34 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:07:49.919 20:03:34 -- common/autotest_common.sh@141 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:07:49.919 20:03:34 -- common/autotest_common.sh@143 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:07:49.919 20:03:34 -- common/autotest_common.sh@145 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:07:49.919 20:03:34 -- common/autotest_common.sh@147 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@148 
-- # export SPDK_TEST_USE_IGB_UIO 00:07:49.919 20:03:34 -- common/autotest_common.sh@149 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:07:49.919 20:03:34 -- common/autotest_common.sh@151 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:07:49.919 20:03:34 -- common/autotest_common.sh@153 -- # : 00:07:49.919 20:03:34 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:07:49.919 20:03:34 -- common/autotest_common.sh@155 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:07:49.919 20:03:34 -- common/autotest_common.sh@157 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:07:49.919 20:03:34 -- common/autotest_common.sh@159 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:07:49.919 20:03:34 -- common/autotest_common.sh@161 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:07:49.919 20:03:34 -- common/autotest_common.sh@163 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:07:49.919 20:03:34 -- common/autotest_common.sh@166 -- # : 00:07:49.919 20:03:34 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:07:49.919 20:03:34 -- common/autotest_common.sh@168 -- # : 0 00:07:49.919 20:03:34 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:07:49.919 20:03:34 -- common/autotest_common.sh@170 -- # : 0 00:07:49.920 20:03:34 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:07:49.920 20:03:34 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:49.920 20:03:34 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:07:49.920 20:03:34 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:49.920 20:03:34 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:07:49.920 20:03:34 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:49.920 20:03:34 -- common/autotest_common.sh@176 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:49.920 20:03:34 -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:49.920 20:03:34 -- 
common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:07:49.920 20:03:34 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:07:49.920 20:03:34 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:07:49.920 20:03:34 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:49.920 20:03:34 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:49.920 20:03:34 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:07:49.920 20:03:34 -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:07:49.920 20:03:34 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:49.920 20:03:34 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:07:49.920 20:03:34 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:49.920 20:03:34 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:07:49.920 20:03:34 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:07:49.920 20:03:34 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:07:49.920 20:03:34 -- common/autotest_common.sh@199 -- # cat 00:07:49.920 20:03:34 -- common/autotest_common.sh@235 -- # echo leak:libfuse3.so 00:07:49.920 20:03:34 -- common/autotest_common.sh@237 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:07:49.920 20:03:34 -- common/autotest_common.sh@237 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 
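The sanitizer and search-path setup traced above reduces to the following; this is a hedged reconstruction from the traced values (output redirections and the exact helper structure are not visible in the trace and are assumed):

  rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk

  # Library and Python search paths are appended rather than replaced, which is why the
  # traced LD_LIBRARY_PATH and PYTHONPATH values repeat the same directories several times.
  export SPDK_LIB_DIR=$rootdir/build/lib
  export DPDK_LIB_DIR=$rootdir/dpdk/build/lib
  export VFIO_LIB_DIR=$rootdir/build/libvfio-user/usr/local/lib
  export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$SPDK_LIB_DIR:$DPDK_LIB_DIR:$VFIO_LIB_DIR
  export PYTHONPATH=$PYTHONPATH:$rootdir/python:$rootdir/test/rpc_plugins
  export PYTHONDONTWRITEBYTECODE=1

  # Sanitizer knobs: abort hard on the first error and keep coredumps usable.
  export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
  export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134

  # Known-benign leaks (libfuse3) are routed into a LeakSanitizer suppression file.
  asan_suppression_file=/var/tmp/asan_suppression_file
  rm -rf "$asan_suppression_file"
  echo "leak:libfuse3.so" >> "$asan_suppression_file"
  export LSAN_OPTIONS=suppressions=$asan_suppression_file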
00:07:49.920 20:03:34 -- common/autotest_common.sh@239 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:49.920 20:03:34 -- common/autotest_common.sh@239 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:07:49.920 20:03:34 -- common/autotest_common.sh@241 -- # '[' -z /var/spdk/dependencies ']' 00:07:49.920 20:03:34 -- common/autotest_common.sh@244 -- # export DEPENDENCY_DIR 00:07:49.920 20:03:34 -- common/autotest_common.sh@248 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:49.920 20:03:34 -- common/autotest_common.sh@248 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:07:49.920 20:03:34 -- common/autotest_common.sh@249 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:49.920 20:03:34 -- common/autotest_common.sh@249 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:07:49.920 20:03:34 -- common/autotest_common.sh@252 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:49.920 20:03:34 -- common/autotest_common.sh@252 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:07:49.920 20:03:34 -- common/autotest_common.sh@253 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:49.920 20:03:34 -- common/autotest_common.sh@253 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:07:49.920 20:03:34 -- common/autotest_common.sh@255 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:49.920 20:03:34 -- common/autotest_common.sh@255 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:07:49.920 20:03:34 -- common/autotest_common.sh@258 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:49.920 20:03:34 -- common/autotest_common.sh@258 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:07:49.920 20:03:34 -- common/autotest_common.sh@261 -- # '[' 0 -eq 0 ']' 00:07:49.920 20:03:34 -- common/autotest_common.sh@262 -- # export valgrind= 00:07:49.920 20:03:34 -- common/autotest_common.sh@262 -- # valgrind= 00:07:49.920 20:03:34 -- common/autotest_common.sh@268 -- # uname -s 00:07:49.920 20:03:34 -- common/autotest_common.sh@268 -- # '[' Linux = Linux ']' 00:07:49.920 20:03:34 -- common/autotest_common.sh@269 -- # HUGEMEM=4096 00:07:49.920 20:03:34 -- common/autotest_common.sh@270 -- # export CLEAR_HUGE=yes 00:07:49.920 20:03:34 -- common/autotest_common.sh@270 -- # CLEAR_HUGE=yes 00:07:49.920 20:03:34 -- common/autotest_common.sh@271 -- # [[ 0 -eq 1 ]] 00:07:49.920 20:03:34 -- common/autotest_common.sh@271 -- # [[ 0 -eq 1 ]] 00:07:49.920 20:03:34 -- common/autotest_common.sh@278 -- # MAKE=make 00:07:49.920 20:03:34 -- common/autotest_common.sh@279 -- # MAKEFLAGS=-j72 00:07:49.920 20:03:34 -- common/autotest_common.sh@295 -- # export HUGEMEM=4096 00:07:49.920 20:03:34 -- common/autotest_common.sh@295 -- # HUGEMEM=4096 00:07:49.920 20:03:34 -- common/autotest_common.sh@297 -- # NO_HUGE=() 00:07:49.920 20:03:34 -- common/autotest_common.sh@298 -- # TEST_MODE= 00:07:49.920 20:03:34 -- common/autotest_common.sh@317 -- # [[ -z 1620998 ]] 00:07:49.920 20:03:34 -- common/autotest_common.sh@317 -- # kill -0 1620998 00:07:49.920 20:03:34 -- common/autotest_common.sh@1676 -- # set_test_storage 2147483648 00:07:49.920 20:03:34 -- common/autotest_common.sh@327 -- # [[ -v testdir ]] 00:07:49.920 20:03:34 -- common/autotest_common.sh@329 -- # local requested_size=2147483648 
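The set_test_storage call entered here requests 2147483648 bytes (2 GiB) of scratch space; the selection logic traced in the lines that follow amounts to roughly the sketch below. This is a simplified, hedged version: the function name pick_test_storage is illustrative, and the real helper additionally special-cases tmpfs/ramfs mounts and applies a 95% usage cap, as visible in the trace.

  # Pick the first candidate directory whose filesystem has enough free space,
  # then export it as SPDK_TEST_STORAGE for the fuzz run.
  pick_test_storage() {
      local requested_size=$1; shift
      local target_dir mount avail_kb
      for target_dir in "$@"; do
          # Resolve the mount point backing the candidate directory.
          mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/ {print $6}')
          # df reports 1K blocks; convert to bytes before comparing.
          avail_kb=$(df --output=avail "$mount" | tail -n1)
          if (( avail_kb * 1024 >= requested_size )); then
              export SPDK_TEST_STORAGE=$target_dir
              printf '* Found test storage at %s\n' "$target_dir"
              return 0
          fi
      done
      return 1
  }

  # e.g. pick_test_storage 2147483648 "$testdir" "$(mktemp -udt spdk.XXXXXX)"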
00:07:49.920 20:03:34 -- common/autotest_common.sh@330 -- # local mount target_dir 00:07:49.920 20:03:34 -- common/autotest_common.sh@332 -- # local -A mounts fss sizes avails uses 00:07:49.920 20:03:34 -- common/autotest_common.sh@333 -- # local source fs size avail mount use 00:07:49.920 20:03:34 -- common/autotest_common.sh@335 -- # local storage_fallback storage_candidates 00:07:49.920 20:03:34 -- common/autotest_common.sh@337 -- # mktemp -udt spdk.XXXXXX 00:07:49.920 20:03:34 -- common/autotest_common.sh@337 -- # storage_fallback=/tmp/spdk.6Z2mcI 00:07:49.920 20:03:34 -- common/autotest_common.sh@342 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:07:49.920 20:03:34 -- common/autotest_common.sh@344 -- # [[ -n '' ]] 00:07:49.920 20:03:34 -- common/autotest_common.sh@349 -- # [[ -n '' ]] 00:07:49.920 20:03:34 -- common/autotest_common.sh@354 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.6Z2mcI/tests/nvmf /tmp/spdk.6Z2mcI 00:07:49.920 20:03:34 -- common/autotest_common.sh@357 -- # requested_size=2214592512 00:07:49.920 20:03:34 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:07:49.920 20:03:34 -- common/autotest_common.sh@326 -- # df -T 00:07:49.920 20:03:34 -- common/autotest_common.sh@326 -- # grep -v Filesystem 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_devtmpfs 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # fss["$mount"]=devtmpfs 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # avails["$mount"]=67108864 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # sizes["$mount"]=67108864 00:07:49.920 20:03:34 -- common/autotest_common.sh@362 -- # uses["$mount"]=0 00:07:49.920 20:03:34 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # mounts["$mount"]=/dev/pmem0 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # fss["$mount"]=ext2 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # avails["$mount"]=818380800 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # sizes["$mount"]=5284429824 00:07:49.920 20:03:34 -- common/autotest_common.sh@362 -- # uses["$mount"]=4466049024 00:07:49.920 20:03:34 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_root 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # fss["$mount"]=overlay 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # avails["$mount"]=87036264448 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # sizes["$mount"]=94508552192 00:07:49.920 20:03:34 -- common/autotest_common.sh@362 -- # uses["$mount"]=7472287744 00:07:49.920 20:03:34 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # avails["$mount"]=47251660800 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254274048 00:07:49.920 20:03:34 -- common/autotest_common.sh@362 -- # uses["$mount"]=2613248 00:07:49.920 20:03:34 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # 
mounts["$mount"]=tmpfs 00:07:49.920 20:03:34 -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # avails["$mount"]=18895634432 00:07:49.920 20:03:34 -- common/autotest_common.sh@361 -- # sizes["$mount"]=18901712896 00:07:49.920 20:03:34 -- common/autotest_common.sh@362 -- # uses["$mount"]=6078464 00:07:49.921 20:03:34 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:07:49.921 20:03:34 -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:07:49.921 20:03:34 -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:07:49.921 20:03:34 -- common/autotest_common.sh@361 -- # avails["$mount"]=47253700608 00:07:49.921 20:03:34 -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254278144 00:07:49.921 20:03:34 -- common/autotest_common.sh@362 -- # uses["$mount"]=577536 00:07:49.921 20:03:34 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:07:49.921 20:03:34 -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:07:49.921 20:03:34 -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:07:49.921 20:03:34 -- common/autotest_common.sh@361 -- # avails["$mount"]=9450848256 00:07:49.921 20:03:34 -- common/autotest_common.sh@361 -- # sizes["$mount"]=9450852352 00:07:49.921 20:03:34 -- common/autotest_common.sh@362 -- # uses["$mount"]=4096 00:07:49.921 20:03:34 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:07:49.921 20:03:34 -- common/autotest_common.sh@365 -- # printf '* Looking for test storage...\n' 00:07:49.921 * Looking for test storage... 00:07:49.921 20:03:34 -- common/autotest_common.sh@367 -- # local target_space new_size 00:07:49.921 20:03:34 -- common/autotest_common.sh@368 -- # for target_dir in "${storage_candidates[@]}" 00:07:49.921 20:03:34 -- common/autotest_common.sh@371 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.921 20:03:34 -- common/autotest_common.sh@371 -- # awk '$1 !~ /Filesystem/{print $6}' 00:07:49.921 20:03:34 -- common/autotest_common.sh@371 -- # mount=/ 00:07:49.921 20:03:34 -- common/autotest_common.sh@373 -- # target_space=87036264448 00:07:49.921 20:03:34 -- common/autotest_common.sh@374 -- # (( target_space == 0 || target_space < requested_size )) 00:07:49.921 20:03:34 -- common/autotest_common.sh@377 -- # (( target_space >= requested_size )) 00:07:49.921 20:03:34 -- common/autotest_common.sh@379 -- # [[ overlay == tmpfs ]] 00:07:49.921 20:03:34 -- common/autotest_common.sh@379 -- # [[ overlay == ramfs ]] 00:07:49.921 20:03:34 -- common/autotest_common.sh@379 -- # [[ / == / ]] 00:07:49.921 20:03:34 -- common/autotest_common.sh@380 -- # new_size=9686880256 00:07:49.921 20:03:34 -- common/autotest_common.sh@381 -- # (( new_size * 100 / sizes[/] > 95 )) 00:07:49.921 20:03:34 -- common/autotest_common.sh@386 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.921 20:03:34 -- common/autotest_common.sh@386 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.921 20:03:34 -- common/autotest_common.sh@387 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.921 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:49.921 20:03:34 -- common/autotest_common.sh@388 -- # return 0 00:07:49.921 20:03:34 -- 
common/autotest_common.sh@1678 -- # set -o errtrace 00:07:49.921 20:03:34 -- common/autotest_common.sh@1679 -- # shopt -s extdebug 00:07:49.921 20:03:34 -- common/autotest_common.sh@1680 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:07:49.921 20:03:34 -- common/autotest_common.sh@1682 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:07:49.921 20:03:34 -- common/autotest_common.sh@1683 -- # true 00:07:49.921 20:03:34 -- common/autotest_common.sh@1685 -- # xtrace_fd 00:07:49.921 20:03:34 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:07:49.921 20:03:34 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:07:49.921 20:03:34 -- common/autotest_common.sh@27 -- # exec 00:07:49.921 20:03:34 -- common/autotest_common.sh@29 -- # exec 00:07:49.921 20:03:34 -- common/autotest_common.sh@31 -- # xtrace_restore 00:07:49.921 20:03:34 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:07:49.921 20:03:34 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:07:49.921 20:03:34 -- common/autotest_common.sh@18 -- # set -x 00:07:49.921 20:03:34 -- nvmf/run.sh@61 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh 00:07:49.921 20:03:34 -- ../common.sh@8 -- # pids=() 00:07:49.921 20:03:34 -- nvmf/run.sh@63 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:49.921 20:03:34 -- nvmf/run.sh@64 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c 00:07:49.921 20:03:34 -- nvmf/run.sh@64 -- # fuzz_num=25 00:07:49.921 20:03:34 -- nvmf/run.sh@65 -- # (( fuzz_num != 0 )) 00:07:49.921 20:03:34 -- nvmf/run.sh@67 -- # trap 'cleanup /tmp/llvm_fuzz* /var/tmp/suppress_nvmf_fuzz; exit 1' SIGINT SIGTERM EXIT 00:07:49.921 20:03:34 -- nvmf/run.sh@69 -- # mem_size=512 00:07:49.921 20:03:34 -- nvmf/run.sh@70 -- # [[ 1 -eq 1 ]] 00:07:49.921 20:03:34 -- nvmf/run.sh@71 -- # start_llvm_fuzz_short 25 1 00:07:49.921 20:03:34 -- ../common.sh@69 -- # local fuzz_num=25 00:07:49.921 20:03:34 -- ../common.sh@70 -- # local time=1 00:07:49.921 20:03:34 -- ../common.sh@72 -- # (( i = 0 )) 00:07:49.921 20:03:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.921 20:03:34 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:07:49.921 20:03:34 -- nvmf/run.sh@23 -- # local fuzzer_type=0 00:07:49.921 20:03:34 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.921 20:03:34 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.921 20:03:34 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:49.921 20:03:34 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf 00:07:49.921 20:03:34 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:49.921 20:03:34 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:49.921 20:03:34 -- nvmf/run.sh@34 -- # printf %02d 0 00:07:49.921 20:03:34 -- nvmf/run.sh@34 -- # port=4400 00:07:49.921 20:03:34 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:49.921 20:03:34 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' 00:07:49.921 20:03:34 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.921 20:03:34 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:49.921 20:03:34 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:49.921 20:03:34 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 00:07:49.921 [2024-04-26 20:03:34.289705] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:49.921 [2024-04-26 20:03:34.289784] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1621045 ] 00:07:49.921 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.180 [2024-04-26 20:03:34.479360] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.180 [2024-04-26 20:03:34.550853] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.180 [2024-04-26 20:03:34.610378] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.438 [2024-04-26 20:03:34.626601] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 *** 00:07:50.438 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.438 INFO: Seed: 1072649807 00:07:50.438 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:07:50.438 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:07:50.438 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 00:07:50.438 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.438 #2 INITED exec/s: 0 rss: 62Mb 00:07:50.438 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
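For reference, the per-fuzzer setup that nvmf/run.sh performs above (fuzzer 0 of the 25 counted from llvm_nvme_fuzz.c) corresponds roughly to the following; paths and option values are taken from the trace, while the output redirections and the port arithmetic are inferred and may differ from the actual script:

  rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  fuzzer_type=0
  printf -v idx %02d "$fuzzer_type"
  port=44$idx                                  # 4400 for fuzzer 0, 4401 for fuzzer 1, ... (inferred)

  corpus_dir=$rootdir/../corpus/llvm_nvmf_$fuzzer_type
  nvmf_cfg=/tmp/fuzz_json_$fuzzer_type.conf
  suppress_file=/var/tmp/suppress_nvmf_fuzz
  mkdir -p "$corpus_dir"

  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

  # Point the JSON target config at the per-fuzzer port and suppress the two
  # nvmf allocation sites echoed in the trace above.
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  echo "leak:spdk_nvmf_qpair_disconnect" >> "$suppress_file"
  echo "leak:nvmf_ctrlr_create" >> "$suppress_file"

  # Run the libFuzzer-instrumented NVMe-oF target for 1 second on core 0x1 with 512 MB of memory.
  LSAN_OPTIONS=report_objects=1:suppressions=$suppress_file:print_suppressions=0 \
  "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" \
      -m 0x1 -s 512 -P "$rootdir/../output/llvm/" \
      -F "$trid" -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$fuzzer_type"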
00:07:50.438 This may also happen if the target rejected all inputs we tried so far 00:07:50.438 [2024-04-26 20:03:34.681821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.438 [2024-04-26 20:03:34.681850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.696 NEW_FUNC[1/669]: 0x481d00 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47 00:07:50.696 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.696 #8 NEW cov: 11609 ft: 11608 corp: 2/65b lim: 320 exec/s: 0 rss: 69Mb L: 64/64 MS: 1 InsertRepeatedBytes- 00:07:50.696 [2024-04-26 20:03:35.002673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:50.696 [2024-04-26 20:03:35.002707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.696 [2024-04-26 20:03:35.002765] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:00006f6f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:50.696 [2024-04-26 20:03:35.002779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.696 NEW_FUNC[1/2]: 0x1309f20 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2027 00:07:50.696 NEW_FUNC[2/2]: 0x170e3a0 in nvme_get_sgl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:159 00:07:50.696 #9 NEW cov: 11793 ft: 12264 corp: 3/206b lim: 320 exec/s: 0 rss: 70Mb L: 141/141 MS: 1 InsertRepeatedBytes- 00:07:50.696 [2024-04-26 20:03:35.052698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.696 [2024-04-26 20:03:35.052725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.696 #10 NEW cov: 11799 ft: 12617 corp: 4/270b lim: 320 exec/s: 0 rss: 70Mb L: 64/141 MS: 1 ShuffleBytes- 00:07:50.696 [2024-04-26 20:03:35.092808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.696 [2024-04-26 20:03:35.092833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.696 #11 NEW cov: 11884 ft: 12922 corp: 5/335b lim: 320 exec/s: 0 rss: 70Mb L: 65/141 MS: 1 CrossOver- 00:07:50.696 [2024-04-26 20:03:35.132904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:50.696 [2024-04-26 20:03:35.132929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.954 #15 NEW cov: 11884 ft: 13016 corp: 6/399b lim: 320 exec/s: 0 rss: 70Mb L: 64/141 MS: 4 EraseBytes-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:07:50.954 [2024-04-26 20:03:35.173005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:50.954 [2024-04-26 
20:03:35.173029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.954 #16 NEW cov: 11884 ft: 13095 corp: 7/506b lim: 320 exec/s: 0 rss: 70Mb L: 107/141 MS: 1 InsertRepeatedBytes- 00:07:50.954 [2024-04-26 20:03:35.213205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:50.954 [2024-04-26 20:03:35.213230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.954 [2024-04-26 20:03:35.213288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:00006f6f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:50.954 [2024-04-26 20:03:35.213302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.954 #17 NEW cov: 11884 ft: 13168 corp: 8/647b lim: 320 exec/s: 0 rss: 70Mb L: 141/141 MS: 1 ShuffleBytes- 00:07:50.954 [2024-04-26 20:03:35.253327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:50.954 [2024-04-26 20:03:35.253354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.954 [2024-04-26 20:03:35.253427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:08006f6f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:50.954 [2024-04-26 20:03:35.253442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.954 #18 NEW cov: 11884 ft: 13230 corp: 9/788b lim: 320 exec/s: 0 rss: 70Mb L: 141/141 MS: 1 ChangeBit- 00:07:50.954 [2024-04-26 20:03:35.293586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:50.954 [2024-04-26 20:03:35.293610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.954 [2024-04-26 20:03:35.293665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (85) qid:0 cid:5 nsid:85858585 cdw10:85858585 cdw11:85858585 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:50.954 [2024-04-26 20:03:35.293678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.954 [2024-04-26 20:03:35.293733] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:6 nsid:6f6f6f6f cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:50.954 [2024-04-26 20:03:35.293746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.954 NEW_FUNC[1/1]: 0x170ef00 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143 00:07:50.954 #19 NEW cov: 11898 ft: 13784 corp: 10/989b lim: 320 exec/s: 0 rss: 70Mb L: 201/201 MS: 1 InsertRepeatedBytes- 00:07:50.954 [2024-04-26 20:03:35.343497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:50.954 [2024-04-26 20:03:35.343521] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.954 #20 NEW cov: 11898 ft: 13803 corp: 11/1054b lim: 320 exec/s: 0 rss: 70Mb L: 65/201 MS: 1 InsertByte- 00:07:50.954 [2024-04-26 20:03:35.383700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:50.954 [2024-04-26 20:03:35.383724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.954 [2024-04-26 20:03:35.383782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x8006f6f6f6f6f 00:07:50.955 [2024-04-26 20:03:35.383795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.213 #21 NEW cov: 11898 ft: 13858 corp: 12/1182b lim: 320 exec/s: 0 rss: 70Mb L: 128/201 MS: 1 EraseBytes- 00:07:51.213 [2024-04-26 20:03:35.423722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.213 [2024-04-26 20:03:35.423746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.213 #22 NEW cov: 11898 ft: 13864 corp: 13/1247b lim: 320 exec/s: 0 rss: 70Mb L: 65/201 MS: 1 InsertByte- 00:07:51.213 [2024-04-26 20:03:35.453795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:51.213 [2024-04-26 20:03:35.453820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.213 #23 NEW cov: 11898 ft: 13910 corp: 14/1319b lim: 320 exec/s: 0 rss: 70Mb L: 72/201 MS: 1 CMP- DE: "\000\012\336\254=\263\214\264"- 00:07:51.213 [2024-04-26 20:03:35.493914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:51.213 [2024-04-26 20:03:35.493942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.213 #24 NEW cov: 11898 ft: 13945 corp: 15/1427b lim: 320 exec/s: 0 rss: 70Mb L: 108/201 MS: 1 InsertByte- 00:07:51.213 [2024-04-26 20:03:35.534022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.213 [2024-04-26 20:03:35.534046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.213 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.213 #30 NEW cov: 11921 ft: 13948 corp: 16/1492b lim: 320 exec/s: 0 rss: 70Mb L: 65/201 MS: 1 ChangeBinInt- 00:07:51.213 [2024-04-26 20:03:35.574262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:8a8a8a8a cdw11:8a8a8a8a 00:07:51.213 [2024-04-26 20:03:35.574286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.213 [2024-04-26 20:03:35.574340] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (8a) qid:0 cid:5 nsid:8a8a8a8a cdw10:8a8a8a8a cdw11:00000000 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x8a8a8a8a8a8a8a8a 00:07:51.213 [2024-04-26 20:03:35.574355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.213 #31 NEW cov: 11921 ft: 13960 corp: 17/1631b lim: 320 exec/s: 0 rss: 70Mb L: 139/201 MS: 1 InsertRepeatedBytes- 00:07:51.213 [2024-04-26 20:03:35.614253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.213 [2024-04-26 20:03:35.614278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.213 #32 NEW cov: 11921 ft: 14015 corp: 18/1695b lim: 320 exec/s: 0 rss: 70Mb L: 64/201 MS: 1 ChangeBinInt- 00:07:51.213 [2024-04-26 20:03:35.654421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:51.213 [2024-04-26 20:03:35.654448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.472 #33 NEW cov: 11921 ft: 14028 corp: 19/1767b lim: 320 exec/s: 33 rss: 71Mb L: 72/201 MS: 1 ChangeBinInt- 00:07:51.472 [2024-04-26 20:03:35.694467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.472 [2024-04-26 20:03:35.694491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.472 #34 NEW cov: 11921 ft: 14050 corp: 20/1833b lim: 320 exec/s: 34 rss: 71Mb L: 66/201 MS: 1 InsertByte- 00:07:51.472 [2024-04-26 20:03:35.734701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:51.472 [2024-04-26 20:03:35.734725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.472 [2024-04-26 20:03:35.734782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:b48cb33d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:51.472 [2024-04-26 20:03:35.734797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.472 #35 NEW cov: 11921 ft: 14063 corp: 21/1974b lim: 320 exec/s: 35 rss: 71Mb L: 141/201 MS: 1 PersAutoDict- DE: "\000\012\336\254=\263\214\264"- 00:07:51.472 [2024-04-26 20:03:35.774835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:8a8a8a8a cdw11:8a8a8a8a 00:07:51.472 [2024-04-26 20:03:35.774859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.472 [2024-04-26 20:03:35.774918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (8a) qid:0 cid:5 nsid:8a8a8a8a cdw10:8a8a8a8a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x8a8a8a8a8a8a8a8a 00:07:51.472 [2024-04-26 20:03:35.774936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.472 #36 NEW cov: 11921 ft: 14127 corp: 22/2113b lim: 320 exec/s: 36 rss: 71Mb L: 139/201 MS: 1 PersAutoDict- DE: "\000\012\336\254=\263\214\264"- 00:07:51.472 [2024-04-26 20:03:35.814828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO 
SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.472 [2024-04-26 20:03:35.814854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.472 #37 NEW cov: 11921 ft: 14149 corp: 23/2186b lim: 320 exec/s: 37 rss: 71Mb L: 73/201 MS: 1 PersAutoDict- DE: "\000\012\336\254=\263\214\264"- 00:07:51.472 [2024-04-26 20:03:35.854939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:51.472 [2024-04-26 20:03:35.854963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.472 #38 NEW cov: 11921 ft: 14160 corp: 24/2259b lim: 320 exec/s: 38 rss: 71Mb L: 73/201 MS: 1 InsertByte- 00:07:51.472 [2024-04-26 20:03:35.895058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:51.472 [2024-04-26 20:03:35.895082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.472 #39 NEW cov: 11921 ft: 14182 corp: 25/2332b lim: 320 exec/s: 39 rss: 71Mb L: 73/201 MS: 1 InsertByte- 00:07:51.731 [2024-04-26 20:03:35.935329] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:51.731 [2024-04-26 20:03:35.935353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.731 [2024-04-26 20:03:35.935412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:51.731 [2024-04-26 20:03:35.935426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.731 [2024-04-26 20:03:35.935482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:6f6f6f6f cdw11:6f6f6f6f SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:51.731 [2024-04-26 20:03:35.935496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.731 #40 NEW cov: 11921 ft: 14285 corp: 26/2545b lim: 320 exec/s: 40 rss: 71Mb L: 213/213 MS: 1 InsertRepeatedBytes- 00:07:51.731 [2024-04-26 20:03:35.975289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:6f6f6f00 00:07:51.731 [2024-04-26 20:03:35.975314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.731 #41 NEW cov: 11921 ft: 14301 corp: 27/2610b lim: 320 exec/s: 41 rss: 71Mb L: 65/213 MS: 1 CrossOver- 00:07:51.731 [2024-04-26 20:03:36.015380] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.731 [2024-04-26 20:03:36.015404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.731 #42 NEW cov: 11921 ft: 14315 corp: 28/2675b lim: 320 exec/s: 42 rss: 71Mb L: 65/213 MS: 1 PersAutoDict- DE: "\000\012\336\254=\263\214\264"- 00:07:51.731 [2024-04-26 20:03:36.045643] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:51.731 [2024-04-26 20:03:36.045666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.731 [2024-04-26 20:03:36.045727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:51.731 [2024-04-26 20:03:36.045741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.731 [2024-04-26 20:03:36.045798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:ffffffff cdw10:006f6f6f cdw11:3dacde0a SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:51.731 [2024-04-26 20:03:36.045811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.731 #43 NEW cov: 11921 ft: 14369 corp: 29/2896b lim: 320 exec/s: 43 rss: 71Mb L: 221/221 MS: 1 PersAutoDict- DE: "\000\012\336\254=\263\214\264"- 00:07:51.731 [2024-04-26 20:03:36.085568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62620000 cdw10:0000b462 cdw11:00000000 00:07:51.731 [2024-04-26 20:03:36.085592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.731 #46 NEW cov: 11921 ft: 14378 corp: 30/3013b lim: 320 exec/s: 46 rss: 71Mb L: 117/221 MS: 3 CrossOver-ChangeBit-CrossOver- 00:07:51.731 [2024-04-26 20:03:36.115673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:62626262 cdw11:00000062 00:07:51.731 [2024-04-26 20:03:36.115699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.731 #47 NEW cov: 11921 ft: 14405 corp: 31/3113b lim: 320 exec/s: 47 rss: 72Mb L: 100/221 MS: 1 CopyPart- 00:07:51.731 [2024-04-26 20:03:36.155974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:51.731 [2024-04-26 20:03:36.155998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.731 [2024-04-26 20:03:36.156055] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:51.731 [2024-04-26 20:03:36.156069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.731 [2024-04-26 20:03:36.156121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:4ffffff cdw10:6f6f6f6f cdw11:6f6f6f6f SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:51.731 [2024-04-26 20:03:36.156135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.990 #48 NEW cov: 11921 ft: 14414 corp: 32/3326b lim: 320 exec/s: 48 rss: 72Mb L: 213/221 MS: 1 ChangeBinInt- 00:07:51.990 [2024-04-26 20:03:36.196025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:8a8a8a00 
cdw11:8a8a8a8a 00:07:51.990 [2024-04-26 20:03:36.196050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.991 [2024-04-26 20:03:36.196107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (8a) qid:0 cid:5 nsid:8a8a8a8a cdw10:8a8a8a8a cdw11:8a8a8a8a SGL TRANSPORT DATA BLOCK TRANSPORT 0x8a8a8a8a8a8a8a8a 00:07:51.991 [2024-04-26 20:03:36.196120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.991 #49 NEW cov: 11921 ft: 14427 corp: 33/3473b lim: 320 exec/s: 49 rss: 72Mb L: 147/221 MS: 1 PersAutoDict- DE: "\000\012\336\254=\263\214\264"- 00:07:51.991 [2024-04-26 20:03:36.236031] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:10000 cdw10:00000000 cdw11:00000000 00:07:51.991 [2024-04-26 20:03:36.236055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.991 #50 NEW cov: 11921 ft: 14441 corp: 34/3537b lim: 320 exec/s: 50 rss: 72Mb L: 64/221 MS: 1 ChangeBinInt- 00:07:51.991 [2024-04-26 20:03:36.266114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00fb0000 cdw11:00000000 00:07:51.991 [2024-04-26 20:03:36.266138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.991 #51 NEW cov: 11921 ft: 14453 corp: 35/3646b lim: 320 exec/s: 51 rss: 72Mb L: 109/221 MS: 1 CopyPart- 00:07:51.991 [2024-04-26 20:03:36.306243] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.991 [2024-04-26 20:03:36.306267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.991 #52 NEW cov: 11921 ft: 14458 corp: 36/3720b lim: 320 exec/s: 52 rss: 72Mb L: 74/221 MS: 1 PersAutoDict- DE: "\000\012\336\254=\263\214\264"- 00:07:51.991 [2024-04-26 20:03:36.346354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:51.991 [2024-04-26 20:03:36.346378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.991 #53 NEW cov: 11921 ft: 14526 corp: 37/3785b lim: 320 exec/s: 53 rss: 72Mb L: 65/221 MS: 1 ChangeByte- 00:07:51.991 [2024-04-26 20:03:36.386563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:51.991 [2024-04-26 20:03:36.386588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.991 [2024-04-26 20:03:36.386645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:00006f6f cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:51.991 [2024-04-26 20:03:36.386659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.991 #54 NEW cov: 11921 ft: 14537 corp: 38/3926b lim: 320 exec/s: 54 rss: 72Mb L: 141/221 MS: 1 ShuffleBytes- 00:07:51.991 [2024-04-26 20:03:36.426628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:51.991 [2024-04-26 20:03:36.426651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.991 [2024-04-26 20:03:36.426707] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:b48cb33d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:51.991 [2024-04-26 20:03:36.426720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.250 #55 NEW cov: 11921 ft: 14546 corp: 39/4067b lim: 320 exec/s: 55 rss: 72Mb L: 141/221 MS: 1 PersAutoDict- DE: "\000\012\336\254=\263\214\264"- 00:07:52.250 [2024-04-26 20:03:36.466700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:52.250 [2024-04-26 20:03:36.466725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.250 #56 NEW cov: 11921 ft: 14553 corp: 40/4139b lim: 320 exec/s: 56 rss: 72Mb L: 72/221 MS: 1 CopyPart- 00:07:52.250 [2024-04-26 20:03:36.506764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:8a8a8a8a cdw11:8a8a8a8a 00:07:52.250 [2024-04-26 20:03:36.506788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.250 #57 NEW cov: 11921 ft: 14555 corp: 41/4264b lim: 320 exec/s: 57 rss: 72Mb L: 125/221 MS: 1 EraseBytes- 00:07:52.250 [2024-04-26 20:03:36.547103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:52.250 [2024-04-26 20:03:36.547127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.250 [2024-04-26 20:03:36.547188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:52.250 [2024-04-26 20:03:36.547202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.250 [2024-04-26 20:03:36.547257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:4ffffff cdw10:6f6f6f6f cdw11:6f6f6f6f SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.250 [2024-04-26 20:03:36.547271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.250 #58 NEW cov: 11921 ft: 14585 corp: 42/4477b lim: 320 exec/s: 58 rss: 72Mb L: 213/221 MS: 1 ChangeBit- 00:07:52.250 [2024-04-26 20:03:36.587039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:52.250 [2024-04-26 20:03:36.587063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.250 #59 NEW cov: 11921 ft: 14594 corp: 43/4542b lim: 320 exec/s: 59 rss: 72Mb L: 65/221 MS: 1 ChangeByte- 00:07:52.250 [2024-04-26 20:03:36.617107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:62626262 cdw10:00000000 cdw11:00000000 00:07:52.250 [2024-04-26 20:03:36.617131] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.250 #60 NEW cov: 11921 ft: 14640 corp: 44/4640b lim: 320 exec/s: 60 rss: 72Mb L: 98/221 MS: 1 InsertRepeatedBytes- 00:07:52.250 [2024-04-26 20:03:36.657388] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:6f6f6f6f cdw11:6f6f6f6f 00:07:52.250 [2024-04-26 20:03:36.657413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.250 [2024-04-26 20:03:36.657469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (6f) qid:0 cid:5 nsid:6f6f6f6f cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x6f6f6f6f6f6f6f6f 00:07:52.250 [2024-04-26 20:03:36.657483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.250 [2024-04-26 20:03:36.657535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (ff) qid:0 cid:6 nsid:4ffffff cdw10:6f6f6f6f cdw11:e6e6e6e6 SGL TRANSPORT DATA BLOCK TRANSPORT 0xffffffffffffffff 00:07:52.250 [2024-04-26 20:03:36.657550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.250 #61 NEW cov: 11921 ft: 14700 corp: 45/4859b lim: 320 exec/s: 30 rss: 72Mb L: 219/221 MS: 1 InsertRepeatedBytes- 00:07:52.250 #61 DONE cov: 11921 ft: 14700 corp: 45/4859b lim: 320 exec/s: 30 rss: 72Mb 00:07:52.250 ###### Recommended dictionary. ###### 00:07:52.250 "\000\012\336\254=\263\214\264" # Uses: 8 00:07:52.250 ###### End of recommended dictionary. ###### 00:07:52.250 Done 61 runs in 2 second(s) 00:07:52.509 20:03:36 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_0.conf /var/tmp/suppress_nvmf_fuzz 00:07:52.509 20:03:36 -- ../common.sh@72 -- # (( i++ )) 00:07:52.509 20:03:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.509 20:03:36 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:52.509 20:03:36 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:52.509 20:03:36 -- nvmf/run.sh@24 -- # local timen=1 00:07:52.509 20:03:36 -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.509 20:03:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:52.509 20:03:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:52.509 20:03:36 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:52.509 20:03:36 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:52.509 20:03:36 -- nvmf/run.sh@34 -- # printf %02d 1 00:07:52.509 20:03:36 -- nvmf/run.sh@34 -- # port=4401 00:07:52.509 20:03:36 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:52.509 20:03:36 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:52.509 20:03:36 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.509 20:03:36 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:52.509 20:03:36 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:52.510 20:03:36 -- nvmf/run.sh@45 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 00:07:52.510 [2024-04-26 20:03:36.858430] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:52.510 [2024-04-26 20:03:36.858499] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1621400 ] 00:07:52.510 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.768 [2024-04-26 20:03:37.058011] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.768 [2024-04-26 20:03:37.129428] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.768 [2024-04-26 20:03:37.188856] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.768 [2024-04-26 20:03:37.205089] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:53.027 INFO: Running with entropic power schedule (0xFF, 100). 00:07:53.027 INFO: Seed: 3651645246 00:07:53.027 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:07:53.027 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:07:53.027 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:53.027 INFO: A corpus is not provided, starting from an empty corpus 00:07:53.027 #2 INITED exec/s: 0 rss: 63Mb 00:07:53.027 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:53.027 This may also happen if the target rejected all inputs we tried so far 00:07:53.027 [2024-04-26 20:03:37.282309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.027 [2024-04-26 20:03:37.282350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.288 NEW_FUNC[1/671]: 0x482600 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:53.288 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.288 #16 NEW cov: 11734 ft: 11735 corp: 2/11b lim: 30 exec/s: 0 rss: 69Mb L: 10/10 MS: 4 CrossOver-ChangeBit-CrossOver-CMP- DE: "\001\000\000\000\000\000\000\000"- 00:07:53.288 [2024-04-26 20:03:37.612620] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (44040) > buf size (4096) 00:07:53.288 [2024-04-26 20:03:37.613112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2b010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.288 [2024-04-26 20:03:37.613152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.288 #20 NEW cov: 11873 ft: 12342 corp: 3/20b lim: 30 exec/s: 0 rss: 69Mb L: 9/10 MS: 4 CopyPart-ChangeBit-ChangeBit-PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:53.288 [2024-04-26 20:03:37.663116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.288 [2024-04-26 20:03:37.663143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.288 #21 NEW cov: 11879 ft: 12526 corp: 4/28b lim: 30 exec/s: 0 rss: 69Mb L: 8/10 MS: 1 EraseBytes- 00:07:53.288 [2024-04-26 20:03:37.722940] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (44040) > buf size (4096) 00:07:53.288 [2024-04-26 20:03:37.723381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2b010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.288 [2024-04-26 20:03:37.723410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.547 #22 NEW cov: 11964 ft: 12780 corp: 5/37b lim: 30 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 ChangeBit- 00:07:53.547 [2024-04-26 20:03:37.783130] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45004) > buf size (4096) 00:07:53.547 [2024-04-26 20:03:37.783558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2bf20001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.547 [2024-04-26 20:03:37.783588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.547 #23 NEW cov: 11964 ft: 12872 corp: 6/47b lim: 30 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 InsertByte- 00:07:53.547 [2024-04-26 20:03:37.833768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
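The NEW_FUNC lines above name the two harness routines behind this run: TestOneInput (llvm_nvme_fuzz.c:780) and fuzz_admin_get_log_page_command (llvm_nvme_fuzz.c:67). The sketch below is only an illustration of that shape, not the actual SPDK source: submit_admin_cmd and the admin_cmd layout are hypothetical stand-ins for the real NVMe-oF/TCP submission path (traddr 127.0.0.1, trsvcid 4401), and the byte-to-dword mapping is a guess at how fuzz input ends up in the NSID/CDW10/CDW11 values that the GET LOG PAGE notices around it show varying.

/* Illustrative sketch only -- not the actual llvm_nvme_fuzz.c. */
#include <stdint.h>
#include <stddef.h>
#include <stdio.h>
#include <string.h>

struct admin_cmd {                 /* just the fields visible in the log output */
    uint8_t  opc;                  /* 0x02 = GET LOG PAGE */
    uint32_t nsid, cdw10, cdw11;
};

/* Hypothetical stand-in: the real harness submits the command over the
 * NVMe-oF/TCP connection and the target prints it, as in the notices above. */
static void submit_admin_cmd(const struct admin_cmd *c)
{
    printf("GET LOG PAGE (%02x) nsid:%08x cdw10:%08x cdw11:%08x\n",
           c->opc, c->nsid, c->cdw10, c->cdw11);
}

static void fuzz_admin_get_log_page_command(const uint8_t *data, size_t size)
{
    struct admin_cmd c = { .opc = 0x02 };
    if (size >= 4)  memcpy(&c.cdw10, data, 4);      /* fuzz bytes drive the dwords */
    if (size >= 8)  memcpy(&c.cdw11, data + 4, 4);
    if (size >= 12) memcpy(&c.nsid,  data + 8, 4);
    submit_admin_cmd(&c);
}

/* libFuzzer-style entry point: every generated input lands here once. */
int TestOneInput(const uint8_t *data, size_t size)
{
    fuzz_admin_get_log_page_command(data, size);
    return 0;
}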
00:07:53.547 [2024-04-26 20:03:37.833794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.547 #24 NEW cov: 11964 ft: 13056 corp: 7/55b lim: 30 exec/s: 0 rss: 69Mb L: 8/10 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:53.547 [2024-04-26 20:03:37.883499] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (357552) > buf size (4096) 00:07:53.547 [2024-04-26 20:03:37.883991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:5d2b81f2 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.547 [2024-04-26 20:03:37.884017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.547 #25 NEW cov: 11964 ft: 13122 corp: 8/66b lim: 30 exec/s: 0 rss: 70Mb L: 11/11 MS: 1 InsertByte- 00:07:53.547 [2024-04-26 20:03:37.943829] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:53.547 [2024-04-26 20:03:37.944089] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:53.547 [2024-04-26 20:03:37.944330] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:53.547 [2024-04-26 20:03:37.944572] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:53.547 [2024-04-26 20:03:37.945011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.547 [2024-04-26 20:03:37.945038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.547 [2024-04-26 20:03:37.945128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.547 [2024-04-26 20:03:37.945146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.547 [2024-04-26 20:03:37.945227] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.547 [2024-04-26 20:03:37.945244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.547 [2024-04-26 20:03:37.945326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.547 [2024-04-26 20:03:37.945343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.547 #26 NEW cov: 11970 ft: 13808 corp: 9/92b lim: 30 exec/s: 0 rss: 70Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:53.806 [2024-04-26 20:03:38.004244] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:53.806 [2024-04-26 20:03:38.004921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.806 [2024-04-26 20:03:38.004959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.806 
[2024-04-26 20:03:38.005048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.806 [2024-04-26 20:03:38.005065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.806 [2024-04-26 20:03:38.005146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.806 [2024-04-26 20:03:38.005162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.806 #27 NEW cov: 11970 ft: 14092 corp: 10/110b lim: 30 exec/s: 0 rss: 70Mb L: 18/26 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:53.806 [2024-04-26 20:03:38.054432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.806 [2024-04-26 20:03:38.054457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.806 #30 NEW cov: 11970 ft: 14126 corp: 11/119b lim: 30 exec/s: 0 rss: 70Mb L: 9/26 MS: 3 ChangeBit-ChangeByte-PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:07:53.806 [2024-04-26 20:03:38.104388] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (44040) > buf size (4096) 00:07:53.806 [2024-04-26 20:03:38.105551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2b010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.806 [2024-04-26 20:03:38.105578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.806 [2024-04-26 20:03:38.105667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.105683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.807 [2024-04-26 20:03:38.105769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.105785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.807 [2024-04-26 20:03:38.105869] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00400000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.105889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.807 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.807 #31 NEW cov: 11993 ft: 14199 corp: 12/143b lim: 30 exec/s: 0 rss: 70Mb L: 24/26 MS: 1 InsertRepeatedBytes- 00:07:53.807 [2024-04-26 20:03:38.164742] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:53.807 [2024-04-26 20:03:38.165426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 
cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.165453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.807 [2024-04-26 20:03:38.165536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.165555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.807 [2024-04-26 20:03:38.165639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.165656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.807 #32 NEW cov: 11993 ft: 14231 corp: 13/161b lim: 30 exec/s: 0 rss: 70Mb L: 18/26 MS: 1 CMP- DE: "\010\000"- 00:07:53.807 [2024-04-26 20:03:38.224846] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000bdbd 00:07:53.807 [2024-04-26 20:03:38.225122] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000bdbd 00:07:53.807 [2024-04-26 20:03:38.225356] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000bdbd 00:07:53.807 [2024-04-26 20:03:38.225596] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (456440) > buf size (4096) 00:07:53.807 [2024-04-26 20:03:38.226283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2b0181bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.226309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.807 [2024-04-26 20:03:38.226403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:bdbd81bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.226421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.807 [2024-04-26 20:03:38.226507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:bdbd81bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.226524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:53.807 [2024-04-26 20:03:38.226618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:bdbd81bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.226635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:53.807 [2024-04-26 20:03:38.226721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:53.807 [2024-04-26 20:03:38.226737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.066 #33 NEW cov: 11993 ft: 14307 corp: 14/191b lim: 30 exec/s: 33 rss: 
70Mb L: 30/30 MS: 1 InsertRepeatedBytes- 00:07:54.066 [2024-04-26 20:03:38.274728] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45004) > buf size (4096) 00:07:54.066 [2024-04-26 20:03:38.275177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2bf20001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.066 [2024-04-26 20:03:38.275204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.066 #34 NEW cov: 11993 ft: 14334 corp: 15/201b lim: 30 exec/s: 34 rss: 70Mb L: 10/30 MS: 1 PersAutoDict- DE: "\010\000"- 00:07:54.066 [2024-04-26 20:03:38.325130] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45004) > buf size (4096) 00:07:54.066 [2024-04-26 20:03:38.325369] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:54.066 [2024-04-26 20:03:38.325609] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.066 [2024-04-26 20:03:38.325843] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.066 [2024-04-26 20:03:38.326271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2bf20001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.066 [2024-04-26 20:03:38.326299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.066 [2024-04-26 20:03:38.326396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00080000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.066 [2024-04-26 20:03:38.326414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.066 [2024-04-26 20:03:38.326505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.066 [2024-04-26 20:03:38.326523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.066 [2024-04-26 20:03:38.326607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.066 [2024-04-26 20:03:38.326625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.066 #35 NEW cov: 11993 ft: 14351 corp: 16/229b lim: 30 exec/s: 35 rss: 70Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:07:54.066 [2024-04-26 20:03:38.385153] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45004) > buf size (4096) 00:07:54.066 [2024-04-26 20:03:38.385645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2bf20001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.066 [2024-04-26 20:03:38.385674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.066 #36 NEW cov: 11993 ft: 14379 corp: 17/239b lim: 30 exec/s: 36 rss: 70Mb L: 10/30 MS: 1 ChangeByte- 00:07:54.066 [2024-04-26 20:03:38.435516] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (357552) > buf size (4096) 00:07:54.066 
[2024-04-26 20:03:38.435765] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009797 00:07:54.066 [2024-04-26 20:03:38.436080] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009797 00:07:54.066 [2024-04-26 20:03:38.436383] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x300009797 00:07:54.067 [2024-04-26 20:03:38.436892] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:5d2b81f2 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.067 [2024-04-26 20:03:38.436932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.067 [2024-04-26 20:03:38.437021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.067 [2024-04-26 20:03:38.437039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.067 [2024-04-26 20:03:38.437119] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:97978397 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.067 [2024-04-26 20:03:38.437135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.067 [2024-04-26 20:03:38.437225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:97978397 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.067 [2024-04-26 20:03:38.437242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.067 #37 NEW cov: 11993 ft: 14480 corp: 18/267b lim: 30 exec/s: 37 rss: 70Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:07:54.067 [2024-04-26 20:03:38.495726] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff 00:07:54.067 [2024-04-26 20:03:38.496002] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.067 [2024-04-26 20:03:38.496243] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.067 [2024-04-26 20:03:38.496480] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:54.067 [2024-04-26 20:03:38.496924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.067 [2024-04-26 20:03:38.496951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.067 [2024-04-26 20:03:38.497039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.067 [2024-04-26 20:03:38.497058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.067 [2024-04-26 20:03:38.497141] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.067 [2024-04-26 20:03:38.497159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.067 [2024-04-26 20:03:38.497246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.067 [2024-04-26 20:03:38.497263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.326 #38 NEW cov: 11993 ft: 14485 corp: 19/293b lim: 30 exec/s: 38 rss: 70Mb L: 26/30 MS: 1 ChangeBit- 00:07:54.326 [2024-04-26 20:03:38.555784] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (569292) > buf size (4096) 00:07:54.326 [2024-04-26 20:03:38.556275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2bf20201 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.556303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.326 #39 NEW cov: 11993 ft: 14498 corp: 20/303b lim: 30 exec/s: 39 rss: 70Mb L: 10/30 MS: 1 ChangeBinInt- 00:07:54.326 [2024-04-26 20:03:38.606065] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x2b01 00:07:54.326 [2024-04-26 20:03:38.606805] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (64) > len (4) 00:07:54.326 [2024-04-26 20:03:38.607256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.607285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.326 [2024-04-26 20:03:38.607377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.607395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.326 [2024-04-26 20:03:38.607487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.607504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.326 [2024-04-26 20:03:38.607589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.607607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.326 #40 NEW cov: 11999 ft: 14528 corp: 21/331b lim: 30 exec/s: 40 rss: 70Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:07:54.326 [2024-04-26 20:03:38.666436] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (524292) > buf size (4096) 00:07:54.326 [2024-04-26 20:03:38.667149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.667177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.326 [2024-04-26 20:03:38.667265] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:0000020a cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.667282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.326 [2024-04-26 20:03:38.667378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.667395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.326 #41 NEW cov: 11999 ft: 14552 corp: 22/349b lim: 30 exec/s: 41 rss: 70Mb L: 18/30 MS: 1 ChangeBit- 00:07:54.326 [2024-04-26 20:03:38.716276] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (2048) > len (1028) 00:07:54.326 [2024-04-26 20:03:38.716715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.716744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.326 #42 NEW cov: 11999 ft: 14571 corp: 23/357b lim: 30 exec/s: 42 rss: 70Mb L: 8/30 MS: 1 ChangeBinInt- 00:07:54.326 [2024-04-26 20:03:38.766748] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (10244) > buf size (4096) 00:07:54.326 [2024-04-26 20:03:38.767949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.767974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.326 [2024-04-26 20:03:38.768061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.768079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.326 [2024-04-26 20:03:38.768165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.768182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.326 [2024-04-26 20:03:38.768267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.326 [2024-04-26 20:03:38.768285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.585 #43 NEW cov: 11999 ft: 14594 corp: 24/385b lim: 30 exec/s: 43 rss: 70Mb L: 28/30 MS: 1 InsertRepeatedBytes- 00:07:54.585 [2024-04-26 20:03:38.816717] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000bdbd 00:07:54.585 [2024-04-26 20:03:38.816991] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000bdbd 00:07:54.585 [2024-04-26 20:03:38.817239] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (193540) > buf size (4096) 00:07:54.585 
[2024-04-26 20:03:38.817681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:010081bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.585 [2024-04-26 20:03:38.817707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.585 [2024-04-26 20:03:38.817797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:bdbd81bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.585 [2024-04-26 20:03:38.817815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.585 [2024-04-26 20:03:38.817904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:bd000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.585 [2024-04-26 20:03:38.817921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.585 #44 NEW cov: 11999 ft: 14666 corp: 25/407b lim: 30 exec/s: 44 rss: 71Mb L: 22/30 MS: 1 CrossOver- 00:07:54.585 [2024-04-26 20:03:38.876906] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (2048) > len (1028) 00:07:54.585 [2024-04-26 20:03:38.877378] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.585 [2024-04-26 20:03:38.877405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.585 #45 NEW cov: 11999 ft: 14726 corp: 26/415b lim: 30 exec/s: 45 rss: 71Mb L: 8/30 MS: 1 ChangeBinInt- 00:07:54.585 [2024-04-26 20:03:38.927081] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (2056) > len (1028) 00:07:54.585 [2024-04-26 20:03:38.927510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.585 [2024-04-26 20:03:38.927537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.585 #46 NEW cov: 11999 ft: 14742 corp: 27/423b lim: 30 exec/s: 46 rss: 71Mb L: 8/30 MS: 1 ChangeBinInt- 00:07:54.585 [2024-04-26 20:03:38.977567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.585 [2024-04-26 20:03:38.977592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.585 #47 NEW cov: 11999 ft: 14764 corp: 28/432b lim: 30 exec/s: 47 rss: 71Mb L: 9/30 MS: 1 ChangeBit- 00:07:54.585 [2024-04-26 20:03:39.027490] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (357552) > buf size (4096) 00:07:54.585 [2024-04-26 20:03:39.027774] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (256) > len (36) 00:07:54.585 [2024-04-26 20:03:39.028030] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x9797 00:07:54.845 [2024-04-26 20:03:39.028471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:5d2b81f2 cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.028497] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.028583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00080008 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.028602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.028688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.028704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.845 #48 NEW cov: 11999 ft: 14769 corp: 29/455b lim: 30 exec/s: 48 rss: 71Mb L: 23/30 MS: 1 CrossOver- 00:07:54.845 [2024-04-26 20:03:39.087537] ctrlr.c:2605:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: offset (2048) > len (1028) 00:07:54.845 [2024-04-26 20:03:39.087975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:01000008 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.088006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.845 #49 NEW cov: 11999 ft: 14782 corp: 30/463b lim: 30 exec/s: 49 rss: 71Mb L: 8/30 MS: 1 ChangeBinInt- 00:07:54.845 [2024-04-26 20:03:39.137912] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (45004) > buf size (4096) 00:07:54.845 [2024-04-26 20:03:39.138176] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:54.845 [2024-04-26 20:03:39.138428] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.845 [2024-04-26 20:03:39.138668] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:54.845 [2024-04-26 20:03:39.139113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2bf20001 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.139139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.139224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:08000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.139243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.139331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.139347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.139429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.139444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 
sqhd:0012 p:0 m:0 dnr:0 00:07:54.845 #50 NEW cov: 11999 ft: 14810 corp: 31/491b lim: 30 exec/s: 50 rss: 72Mb L: 28/30 MS: 1 ShuffleBytes- 00:07:54.845 [2024-04-26 20:03:39.198115] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000bdbd 00:07:54.845 [2024-04-26 20:03:39.198364] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000bdbd 00:07:54.845 [2024-04-26 20:03:39.198620] ctrlr.c:2562:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x10000bdbd 00:07:54.845 [2024-04-26 20:03:39.198886] ctrlr.c:2574:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (456440) > buf size (4096) 00:07:54.845 [2024-04-26 20:03:39.199599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:2b0181bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.199626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.199717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:bdbd81bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.199735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.199826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:bdbd81bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.199843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.199939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:bdbd81bd cdw11:00000001 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.199956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.845 [2024-04-26 20:03:39.200054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.200073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:54.845 #51 NEW cov: 11999 ft: 14821 corp: 32/521b lim: 30 exec/s: 51 rss: 72Mb L: 30/30 MS: 1 ChangeByte- 00:07:54.845 [2024-04-26 20:03:39.258617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0100000a cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:54.845 [2024-04-26 20:03:39.258644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.845 #52 NEW cov: 11999 ft: 14869 corp: 33/531b lim: 30 exec/s: 26 rss: 72Mb L: 10/30 MS: 1 CrossOver- 00:07:54.845 #52 DONE cov: 11999 ft: 14869 corp: 33/531b lim: 30 exec/s: 26 rss: 72Mb 00:07:54.845 ###### Recommended dictionary. ###### 00:07:54.845 "\001\000\000\000\000\000\000\000" # Uses: 4 00:07:54.845 "\010\000" # Uses: 1 00:07:54.845 ###### End of recommended dictionary. 
###### 00:07:54.845 Done 52 runs in 2 second(s) 00:07:55.104 20:03:39 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_1.conf /var/tmp/suppress_nvmf_fuzz 00:07:55.104 20:03:39 -- ../common.sh@72 -- # (( i++ )) 00:07:55.104 20:03:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.104 20:03:39 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:55.104 20:03:39 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:55.104 20:03:39 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.104 20:03:39 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.104 20:03:39 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:55.104 20:03:39 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:55.104 20:03:39 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:55.104 20:03:39 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:55.104 20:03:39 -- nvmf/run.sh@34 -- # printf %02d 2 00:07:55.104 20:03:39 -- nvmf/run.sh@34 -- # port=4402 00:07:55.104 20:03:39 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:55.104 20:03:39 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:55.104 20:03:39 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.104 20:03:39 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:55.104 20:03:39 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:55.104 20:03:39 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 00:07:55.104 [2024-04-26 20:03:39.447950] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:55.104 [2024-04-26 20:03:39.448043] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1621756 ] 00:07:55.104 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.363 [2024-04-26 20:03:39.639800] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.363 [2024-04-26 20:03:39.711196] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.363 [2024-04-26 20:03:39.770458] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.363 [2024-04-26 20:03:39.786675] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:55.363 INFO: Running with entropic power schedule (0xFF, 100). 
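This next instance reuses the same harness against a different admin opcode and listener: the transport ID now carries trsvcid:4402, the corpus directory is llvm_nvmf_2, and the NEW_FUNC line below resolves to fuzz_admin_identify_command (llvm_nvme_fuzz.c:95), so the printed commands switch from GET LOG PAGE (02) to IDENTIFY (06). For reading the libFuzzer status lines that follow, roughly: "#N" is the number of inputs executed when the event fired, "cov" and "ft" count covered code points and features, "corp: 29/2896b" is the corpus size in units and bytes, "lim" is the current input-length cap, "exec/s" and "rss" report throughput and memory, "L" gives the length of the new input, and "MS"/"DE" name the mutation sequence and any dictionary entry that produced it.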
00:07:55.363 INFO: Seed: 1936667287 00:07:55.621 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:07:55.621 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:07:55.621 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:55.621 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.621 #2 INITED exec/s: 0 rss: 63Mb 00:07:55.622 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:55.622 This may also happen if the target rejected all inputs we tried so far 00:07:55.622 [2024-04-26 20:03:39.842239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:8f8f000b cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.622 [2024-04-26 20:03:39.842268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.622 [2024-04-26 20:03:39.842339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.622 [2024-04-26 20:03:39.842353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.622 [2024-04-26 20:03:39.842403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.622 [2024-04-26 20:03:39.842416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.622 [2024-04-26 20:03:39.842467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.622 [2024-04-26 20:03:39.842480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.880 NEW_FUNC[1/670]: 0x4850b0 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:55.880 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.880 #4 NEW cov: 11665 ft: 11666 corp: 2/32b lim: 35 exec/s: 0 rss: 69Mb L: 31/31 MS: 2 ChangeBit-InsertRepeatedBytes- 00:07:55.880 [2024-04-26 20:03:40.183194] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.880 [2024-04-26 20:03:40.183233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.880 [2024-04-26 20:03:40.183287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.880 [2024-04-26 20:03:40.183301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.880 [2024-04-26 20:03:40.183352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.880 [2024-04-26 20:03:40.183365] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.880 [2024-04-26 20:03:40.183416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.880 [2024-04-26 20:03:40.183430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.880 #11 NEW cov: 11795 ft: 12128 corp: 3/65b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:55.880 [2024-04-26 20:03:40.222634] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:55.880 [2024-04-26 20:03:40.222768] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:55.880 [2024-04-26 20:03:40.222977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.880 [2024-04-26 20:03:40.223004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.880 [2024-04-26 20:03:40.223060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.880 [2024-04-26 20:03:40.223075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.880 #14 NEW cov: 11810 ft: 12990 corp: 4/82b lim: 35 exec/s: 0 rss: 69Mb L: 17/33 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:55.881 [2024-04-26 20:03:40.263321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.881 [2024-04-26 20:03:40.263345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.881 [2024-04-26 20:03:40.263413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.881 [2024-04-26 20:03:40.263427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.881 [2024-04-26 20:03:40.263480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:422a0042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.881 [2024-04-26 20:03:40.263493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.881 [2024-04-26 20:03:40.263543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.881 [2024-04-26 20:03:40.263557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.881 #15 NEW cov: 11895 ft: 13252 corp: 5/115b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 1 ChangeByte- 00:07:55.881 [2024-04-26 20:03:40.313412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:55.881 [2024-04-26 20:03:40.313436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.881 [2024-04-26 20:03:40.313493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.881 [2024-04-26 20:03:40.313507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.881 [2024-04-26 20:03:40.313557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.881 [2024-04-26 20:03:40.313570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.881 [2024-04-26 20:03:40.313620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:55.881 [2024-04-26 20:03:40.313633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.139 #16 NEW cov: 11895 ft: 13414 corp: 6/149b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 CopyPart- 00:07:56.139 [2024-04-26 20:03:40.353180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:60420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.353206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.139 #20 NEW cov: 11895 ft: 13788 corp: 7/162b lim: 35 exec/s: 0 rss: 69Mb L: 13/34 MS: 4 CopyPart-CrossOver-ChangeByte-CrossOver- 00:07:56.139 [2024-04-26 20:03:40.393282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:60420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.393322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.139 #21 NEW cov: 11895 ft: 13856 corp: 8/175b lim: 35 exec/s: 0 rss: 70Mb L: 13/34 MS: 1 ChangeBit- 00:07:56.139 [2024-04-26 20:03:40.443940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.443967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.444021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.444035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.444103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.444117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.444169] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.444183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.444234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.444248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.139 #22 NEW cov: 11895 ft: 13952 corp: 9/210b lim: 35 exec/s: 0 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:07:56.139 [2024-04-26 20:03:40.493905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.493930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.493984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.493997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.494065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.494079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.494131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.494144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.139 #23 NEW cov: 11895 ft: 14051 corp: 10/238b lim: 35 exec/s: 0 rss: 70Mb L: 28/35 MS: 1 EraseBytes- 00:07:56.139 [2024-04-26 20:03:40.534007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a42000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.534033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.534103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.534121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.534184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.534197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.139 [2024-04-26 20:03:40.534250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 
cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.534264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.139 #25 NEW cov: 11895 ft: 14082 corp: 11/269b lim: 35 exec/s: 0 rss: 70Mb L: 31/35 MS: 2 InsertByte-CrossOver- 00:07:56.139 [2024-04-26 20:03:40.573792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:60420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.139 [2024-04-26 20:03:40.573816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.398 #26 NEW cov: 11895 ft: 14101 corp: 12/282b lim: 35 exec/s: 0 rss: 70Mb L: 13/35 MS: 1 ShuffleBytes- 00:07:56.398 [2024-04-26 20:03:40.624182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a42000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.624206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.398 [2024-04-26 20:03:40.624259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.624273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.398 [2024-04-26 20:03:40.624324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.624338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.398 #27 NEW cov: 11895 ft: 14306 corp: 13/305b lim: 35 exec/s: 0 rss: 70Mb L: 23/35 MS: 1 EraseBytes- 00:07:56.398 [2024-04-26 20:03:40.664427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.664452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.398 [2024-04-26 20:03:40.664503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.664516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.398 [2024-04-26 20:03:40.664567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:422b0042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.664580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.398 [2024-04-26 20:03:40.664631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.664644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.398 #28 NEW cov: 
11895 ft: 14355 corp: 14/339b lim: 35 exec/s: 0 rss: 70Mb L: 34/35 MS: 1 InsertByte- 00:07:56.398 [2024-04-26 20:03:40.704497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a42000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.704525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.398 [2024-04-26 20:03:40.704606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:02420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.704619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.398 [2024-04-26 20:03:40.704669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.398 [2024-04-26 20:03:40.704682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.398 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:56.398 #29 NEW cov: 11918 ft: 14389 corp: 15/366b lim: 35 exec/s: 0 rss: 70Mb L: 27/35 MS: 1 CrossOver- 00:07:56.399 [2024-04-26 20:03:40.754675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:8f8f000b cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.754700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.399 [2024-04-26 20:03:40.754769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.754784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.399 [2024-04-26 20:03:40.754836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.754849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.399 [2024-04-26 20:03:40.754909] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.754923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.399 #30 NEW cov: 11918 ft: 14432 corp: 16/398b lim: 35 exec/s: 0 rss: 70Mb L: 32/35 MS: 1 InsertByte- 00:07:56.399 [2024-04-26 20:03:40.794785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.794810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.399 [2024-04-26 20:03:40.794861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.794881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.399 [2024-04-26 20:03:40.794939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:4242002e cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.794954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.399 [2024-04-26 20:03:40.795007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.795021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.399 #31 NEW cov: 11918 ft: 14449 corp: 17/432b lim: 35 exec/s: 0 rss: 70Mb L: 34/35 MS: 1 ChangeByte- 00:07:56.399 [2024-04-26 20:03:40.834526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:60420042 cdw11:42004260 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.399 [2024-04-26 20:03:40.834560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.657 #32 NEW cov: 11918 ft: 14488 corp: 18/445b lim: 35 exec/s: 32 rss: 70Mb L: 13/35 MS: 1 ChangeByte- 00:07:56.657 [2024-04-26 20:03:40.885075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:7f7f000a cdw11:7f007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.885100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.885154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:7f7f007f cdw11:0a007f7f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.885167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.885219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.885232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.885282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.885295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.657 #33 NEW cov: 11918 ft: 14506 corp: 19/479b lim: 35 exec/s: 33 rss: 70Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:07:56.657 [2024-04-26 20:03:40.925127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:bd0042c5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.925151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.925205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:424200bd cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.925219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.925275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:422b0042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.925289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.925338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.925351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.657 #34 NEW cov: 11918 ft: 14513 corp: 20/513b lim: 35 exec/s: 34 rss: 71Mb L: 34/35 MS: 1 ChangeBinInt- 00:07:56.657 [2024-04-26 20:03:40.965246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.965270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.965339] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.965353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.965406] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.965423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.657 [2024-04-26 20:03:40.965475] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.657 [2024-04-26 20:03:40.965489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.657 #35 NEW cov: 11918 ft: 14577 corp: 21/541b lim: 35 exec/s: 35 rss: 71Mb L: 28/35 MS: 1 CopyPart- 00:07:56.657 [2024-04-26 20:03:41.005377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:8f8f000b cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.658 [2024-04-26 20:03:41.005401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.658 [2024-04-26 20:03:41.005471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:8f2f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.658 [2024-04-26 20:03:41.005487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.658 [2024-04-26 20:03:41.005536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) 
qid:0 cid:6 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.658 [2024-04-26 20:03:41.005550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.658 [2024-04-26 20:03:41.005599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.658 [2024-04-26 20:03:41.005612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.658 #36 NEW cov: 11918 ft: 14596 corp: 22/572b lim: 35 exec/s: 36 rss: 71Mb L: 31/35 MS: 1 ChangeByte- 00:07:56.658 [2024-04-26 20:03:41.045199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:60420042 cdw11:02004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.658 [2024-04-26 20:03:41.045225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.658 #37 NEW cov: 11918 ft: 14640 corp: 23/583b lim: 35 exec/s: 37 rss: 71Mb L: 11/35 MS: 1 EraseBytes- 00:07:56.658 [2024-04-26 20:03:41.085476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a42000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.658 [2024-04-26 20:03:41.085501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.658 [2024-04-26 20:03:41.085553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:02420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.658 [2024-04-26 20:03:41.085567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.658 [2024-04-26 20:03:41.085619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.658 [2024-04-26 20:03:41.085633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.917 #38 NEW cov: 11918 ft: 14652 corp: 24/608b lim: 35 exec/s: 38 rss: 71Mb L: 25/35 MS: 1 EraseBytes- 00:07:56.917 [2024-04-26 20:03:41.135392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:60420042 cdw11:42002c42 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.135416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.917 #39 NEW cov: 11918 ft: 14695 corp: 25/621b lim: 35 exec/s: 39 rss: 71Mb L: 13/35 MS: 1 ChangeByte- 00:07:56.917 [2024-04-26 20:03:41.175883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.175908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.175978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.175992] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.176045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:422a0042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.176059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.176113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420031 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.176128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.917 #40 NEW cov: 11918 ft: 14707 corp: 26/654b lim: 35 exec/s: 40 rss: 71Mb L: 33/35 MS: 1 ChangeByte- 00:07:56.917 [2024-04-26 20:03:41.226092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.226117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.226172] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.226186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.226256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:02004202 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.226270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.226383] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.226398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.917 #41 NEW cov: 11918 ft: 14740 corp: 27/689b lim: 35 exec/s: 41 rss: 71Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:07:56.917 [2024-04-26 20:03:41.266105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:bd0042c5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.266129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.266197] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:424200bd cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.266211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.266265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:422b0042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.266278] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.266331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.266360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.917 #42 NEW cov: 11918 ft: 14777 corp: 28/723b lim: 35 exec/s: 42 rss: 72Mb L: 34/35 MS: 1 ChangeBit- 00:07:56.917 [2024-04-26 20:03:41.306225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:bd0042c5 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.306249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.306316] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:424200bd cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.306330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.306381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:4200422b SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.306393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.306445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.306458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.917 #43 NEW cov: 11918 ft: 14790 corp: 29/757b lim: 35 exec/s: 43 rss: 72Mb L: 34/35 MS: 1 ShuffleBytes- 00:07:56.917 [2024-04-26 20:03:41.346302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:7e004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.346327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.346396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.346410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.346463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.346477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.917 [2024-04-26 20:03:41.346527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:56.917 [2024-04-26 20:03:41.346541] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.176 #44 NEW cov: 11918 ft: 14843 corp: 30/785b lim: 35 exec/s: 44 rss: 72Mb L: 28/35 MS: 1 ChangeByte- 00:07:57.176 [2024-04-26 20:03:41.386346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a42000a cdw11:68004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.386371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.386424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:02420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.386438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.386488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.386505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.176 #45 NEW cov: 11918 ft: 14856 corp: 31/810b lim: 35 exec/s: 45 rss: 72Mb L: 25/35 MS: 1 ChangeByte- 00:07:57.176 [2024-04-26 20:03:41.426597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.426619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.426689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.426703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.426755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:4242002e cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.426768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.426818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:ff004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.426831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.176 #46 NEW cov: 11918 ft: 14881 corp: 32/844b lim: 35 exec/s: 46 rss: 72Mb L: 34/35 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:07:57.176 [2024-04-26 20:03:41.466654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.466677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.466747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 
cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.466761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.466813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.466826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.466885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.466904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.176 #47 NEW cov: 11918 ft: 14899 corp: 33/872b lim: 35 exec/s: 47 rss: 72Mb L: 28/35 MS: 1 ShuffleBytes- 00:07:57.176 [2024-04-26 20:03:41.496602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.496626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.496695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.496709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.496761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:4242002e cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.496778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.176 #48 NEW cov: 11918 ft: 14915 corp: 34/895b lim: 35 exec/s: 48 rss: 72Mb L: 23/35 MS: 1 EraseBytes- 00:07:57.176 [2024-04-26 20:03:41.536836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:908f000b cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.536860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.536934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.536948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.536999] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:8f8f008f cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.537013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.537066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:8f8f008f 
cdw11:8f008f8f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.537080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.176 #49 NEW cov: 11918 ft: 14922 corp: 35/927b lim: 35 exec/s: 49 rss: 72Mb L: 32/35 MS: 1 ChangeBinInt- 00:07:57.176 [2024-04-26 20:03:41.576996] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.577020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.577087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.577101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.577153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:4242002e cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.577166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.577218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.577231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.176 #50 NEW cov: 11918 ft: 14929 corp: 36/961b lim: 35 exec/s: 50 rss: 72Mb L: 34/35 MS: 1 CrossOver- 00:07:57.176 [2024-04-26 20:03:41.617181] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.617206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.617257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.617272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.617324] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.617341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.176 [2024-04-26 20:03:41.617394] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.176 [2024-04-26 20:03:41.617407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.435 #51 NEW cov: 11918 ft: 14935 corp: 37/995b lim: 35 exec/s: 51 rss: 72Mb L: 34/35 MS: 1 ChangeByte- 00:07:57.435 [2024-04-26 20:03:41.656672] 
ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.435 [2024-04-26 20:03:41.656801] ctrlr.c:2656:_nvmf_ctrlr_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:57.435 [2024-04-26 20:03:41.657010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.435 [2024-04-26 20:03:41.657036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.435 [2024-04-26 20:03:41.657089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.435 [2024-04-26 20:03:41.657105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.435 #52 NEW cov: 11918 ft: 14941 corp: 38/1009b lim: 35 exec/s: 52 rss: 72Mb L: 14/35 MS: 1 EraseBytes- 00:07:57.435 [2024-04-26 20:03:41.697348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.435 [2024-04-26 20:03:41.697372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.435 [2024-04-26 20:03:41.697426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.435 [2024-04-26 20:03:41.697439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.435 [2024-04-26 20:03:41.697490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.435 [2024-04-26 20:03:41.697503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.435 [2024-04-26 20:03:41.697553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42003142 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.435 [2024-04-26 20:03:41.697566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.435 #53 NEW cov: 11918 ft: 14949 corp: 39/1038b lim: 35 exec/s: 53 rss: 72Mb L: 29/35 MS: 1 InsertByte- 00:07:57.435 [2024-04-26 20:03:41.737560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.435 [2024-04-26 20:03:41.737583] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.436 [2024-04-26 20:03:41.737654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.737668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.436 [2024-04-26 20:03:41.737719] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 
cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.737735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.436 [2024-04-26 20:03:41.737787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.737801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.436 [2024-04-26 20:03:41.737851] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.737864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.436 #54 NEW cov: 11918 ft: 14956 corp: 40/1073b lim: 35 exec/s: 54 rss: 72Mb L: 35/35 MS: 1 CopyPart- 00:07:57.436 [2024-04-26 20:03:41.777320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:60420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.777345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.436 #55 NEW cov: 11918 ft: 14959 corp: 41/1088b lim: 35 exec/s: 55 rss: 72Mb L: 15/35 MS: 1 CopyPart- 00:07:57.436 [2024-04-26 20:03:41.817768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:4242000a cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.817791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.436 [2024-04-26 20:03:41.817863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.817884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.436 [2024-04-26 20:03:41.817942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.817956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.436 [2024-04-26 20:03:41.818008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.818023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.436 [2024-04-26 20:03:41.818076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:8 nsid:0 cdw10:42420042 cdw11:42004242 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:57.436 [2024-04-26 20:03:41.818090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:57.436 #56 NEW cov: 11918 ft: 14986 corp: 42/1123b lim: 35 exec/s: 28 rss: 72Mb L: 35/35 MS: 1 ShuffleBytes- 00:07:57.436 #56 DONE cov: 11918 ft: 14986 corp: 42/1123b lim: 
35 exec/s: 28 rss: 72Mb 00:07:57.436 ###### Recommended dictionary. ###### 00:07:57.436 "\377\377\377\377\377\377\377\377" # Uses: 0 00:07:57.436 ###### End of recommended dictionary. ###### 00:07:57.436 Done 56 runs in 2 second(s) 00:07:57.695 20:03:41 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_2.conf /var/tmp/suppress_nvmf_fuzz 00:07:57.695 20:03:41 -- ../common.sh@72 -- # (( i++ )) 00:07:57.695 20:03:41 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:57.695 20:03:41 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:57.695 20:03:41 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:57.695 20:03:41 -- nvmf/run.sh@24 -- # local timen=1 00:07:57.695 20:03:41 -- nvmf/run.sh@25 -- # local core=0x1 00:07:57.695 20:03:41 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:57.695 20:03:41 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:57.695 20:03:41 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:07:57.695 20:03:41 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:07:57.695 20:03:41 -- nvmf/run.sh@34 -- # printf %02d 3 00:07:57.695 20:03:41 -- nvmf/run.sh@34 -- # port=4403 00:07:57.695 20:03:41 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:57.695 20:03:41 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:57.695 20:03:41 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:57.695 20:03:41 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:07:57.695 20:03:41 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:07:57.695 20:03:41 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 00:07:57.695 [2024-04-26 20:03:42.004075] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:07:57.695 [2024-04-26 20:03:42.004153] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1622115 ] 00:07:57.695 EAL: No free 2048 kB hugepages reported on node 1 00:07:57.953 [2024-04-26 20:03:42.200023] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:57.953 [2024-04-26 20:03:42.271326] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:57.954 [2024-04-26 20:03:42.330458] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:57.954 [2024-04-26 20:03:42.346627] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:57.954 INFO: Running with entropic power schedule (0xFF, 100). 
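[Editor's note] Each run in this log closes with a libFuzzer summary record ("#56 DONE cov: 11918 ft: 14986 corp: 42/1123b ... exec/s: 28 rss: 72Mb"), an optional recommended-dictionary block, and a "Done 56 runs in 2 second(s)" line before run.sh removes the per-run config and starts the next fuzzer. Below is a minimal, hypothetical helper (not part of the SPDK tree) for pulling those per-run summaries out of a saved copy of this console output; the log path is a placeholder and the patterns rely only on the summary lines shown above.
  # Hypothetical post-processing sketch; console.log is a placeholder for a saved copy of this output.
  log=console.log
  grep -E '#[0-9]+ DONE cov:' "$log"                    # final cov/ft/corp/exec-rate counters for each run
  grep -E 'Done [0-9]+ runs in [0-9]+ second' "$log"    # wall-clock summary printed at the end of each run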
00:07:57.954 INFO: Seed: 202695174 00:07:57.954 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:07:57.954 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:07:57.954 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:57.954 INFO: A corpus is not provided, starting from an empty corpus 00:07:57.954 #2 INITED exec/s: 0 rss: 63Mb 00:07:57.954 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:57.954 This may also happen if the target rejected all inputs we tried so far 00:07:58.471 NEW_FUNC[1/659]: 0x486d80 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:58.471 NEW_FUNC[2/659]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:58.471 #5 NEW cov: 11570 ft: 11571 corp: 2/9b lim: 20 exec/s: 0 rss: 69Mb L: 8/8 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:07:58.471 #16 NEW cov: 11707 ft: 12138 corp: 3/17b lim: 20 exec/s: 0 rss: 69Mb L: 8/8 MS: 1 ChangeBit- 00:07:58.471 #21 NEW cov: 11713 ft: 12614 corp: 4/22b lim: 20 exec/s: 0 rss: 69Mb L: 5/8 MS: 5 ChangeBinInt-ChangeByte-InsertByte-EraseBytes-CMP- DE: "\017\000\000\000"- 00:07:58.471 [2024-04-26 20:03:42.823401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.471 [2024-04-26 20:03:42.823442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.471 NEW_FUNC[1/20]: 0x115f700 in nvmf_qpair_abort_request /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3283 00:07:58.471 NEW_FUNC[2/20]: 0x1160280 in nvmf_qpair_abort_aer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:3225 00:07:58.471 #22 NEW cov: 12123 ft: 13615 corp: 5/37b lim: 20 exec/s: 0 rss: 69Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:07:58.471 [2024-04-26 20:03:42.873889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.471 [2024-04-26 20:03:42.873921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.471 #23 NEW cov: 12139 ft: 14109 corp: 6/57b lim: 20 exec/s: 0 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:58.729 #24 NEW cov: 12140 ft: 14318 corp: 7/74b lim: 20 exec/s: 0 rss: 70Mb L: 17/20 MS: 1 InsertRepeatedBytes- 00:07:58.729 [2024-04-26 20:03:42.964184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.729 [2024-04-26 20:03:42.964209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.729 #25 NEW cov: 12143 ft: 14418 corp: 8/94b lim: 20 exec/s: 0 rss: 70Mb L: 20/20 MS: 1 ChangeBit- 00:07:58.729 [2024-04-26 20:03:43.014253] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:58.729 [2024-04-26 20:03:43.014278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.729 #26 NEW cov: 12143 ft: 14488 corp: 9/114b lim: 20 exec/s: 0 
rss: 70Mb L: 20/20 MS: 1 ShuffleBytes- 00:07:58.729 #27 NEW cov: 12143 ft: 14500 corp: 10/123b lim: 20 exec/s: 0 rss: 70Mb L: 9/20 MS: 1 InsertByte- 00:07:58.729 #28 NEW cov: 12143 ft: 14579 corp: 11/131b lim: 20 exec/s: 0 rss: 70Mb L: 8/20 MS: 1 ChangeBinInt- 00:07:58.729 #29 NEW cov: 12143 ft: 14655 corp: 12/136b lim: 20 exec/s: 0 rss: 70Mb L: 5/20 MS: 1 CopyPart- 00:07:58.988 #30 NEW cov: 12143 ft: 14670 corp: 13/142b lim: 20 exec/s: 0 rss: 70Mb L: 6/20 MS: 1 CrossOver- 00:07:58.988 #31 NEW cov: 12143 ft: 14728 corp: 14/151b lim: 20 exec/s: 0 rss: 70Mb L: 9/20 MS: 1 CrossOver- 00:07:58.988 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.988 #32 NEW cov: 12166 ft: 14788 corp: 15/160b lim: 20 exec/s: 0 rss: 70Mb L: 9/20 MS: 1 CrossOver- 00:07:58.988 #33 NEW cov: 12166 ft: 14812 corp: 16/168b lim: 20 exec/s: 0 rss: 70Mb L: 8/20 MS: 1 PersAutoDict- DE: "\017\000\000\000"- 00:07:58.988 #34 NEW cov: 12166 ft: 14825 corp: 17/173b lim: 20 exec/s: 0 rss: 70Mb L: 5/20 MS: 1 ChangeByte- 00:07:58.988 #35 NEW cov: 12166 ft: 14910 corp: 18/178b lim: 20 exec/s: 35 rss: 70Mb L: 5/20 MS: 1 ChangeBinInt- 00:07:59.247 #36 NEW cov: 12166 ft: 14914 corp: 19/184b lim: 20 exec/s: 36 rss: 70Mb L: 6/20 MS: 1 ChangeBinInt- 00:07:59.247 [2024-04-26 20:03:43.465450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.247 [2024-04-26 20:03:43.465479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.247 #37 NEW cov: 12166 ft: 14994 corp: 20/202b lim: 20 exec/s: 37 rss: 71Mb L: 18/20 MS: 1 EraseBytes- 00:07:59.247 #38 NEW cov: 12166 ft: 15029 corp: 21/215b lim: 20 exec/s: 38 rss: 71Mb L: 13/20 MS: 1 CopyPart- 00:07:59.247 #39 NEW cov: 12166 ft: 15037 corp: 22/229b lim: 20 exec/s: 39 rss: 71Mb L: 14/20 MS: 1 InsertRepeatedBytes- 00:07:59.247 #40 NEW cov: 12166 ft: 15059 corp: 23/243b lim: 20 exec/s: 40 rss: 71Mb L: 14/20 MS: 1 ChangeByte- 00:07:59.247 #41 NEW cov: 12166 ft: 15075 corp: 24/256b lim: 20 exec/s: 41 rss: 71Mb L: 13/20 MS: 1 CrossOver- 00:07:59.504 #42 NEW cov: 12166 ft: 15127 corp: 25/264b lim: 20 exec/s: 42 rss: 71Mb L: 8/20 MS: 1 PersAutoDict- DE: "\017\000\000\000"- 00:07:59.504 [2024-04-26 20:03:43.716232] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.504 [2024-04-26 20:03:43.716260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.504 #43 NEW cov: 12166 ft: 15173 corp: 26/282b lim: 20 exec/s: 43 rss: 71Mb L: 18/20 MS: 1 EraseBytes- 00:07:59.504 #44 NEW cov: 12166 ft: 15186 corp: 27/292b lim: 20 exec/s: 44 rss: 71Mb L: 10/20 MS: 1 InsertByte- 00:07:59.504 [2024-04-26 20:03:43.796245] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.504 [2024-04-26 20:03:43.796274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.504 #45 NEW cov: 12166 ft: 15224 corp: 28/307b lim: 20 exec/s: 45 rss: 71Mb L: 15/20 MS: 1 ChangeBit- 00:07:59.504 [2024-04-26 20:03:43.836564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 
00:07:59.504 [2024-04-26 20:03:43.836591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.504 #46 NEW cov: 12166 ft: 15305 corp: 29/324b lim: 20 exec/s: 46 rss: 72Mb L: 17/20 MS: 1 EraseBytes- 00:07:59.504 #47 NEW cov: 12166 ft: 15319 corp: 30/338b lim: 20 exec/s: 47 rss: 72Mb L: 14/20 MS: 1 CopyPart- 00:07:59.762 #48 NEW cov: 12166 ft: 15320 corp: 31/352b lim: 20 exec/s: 48 rss: 72Mb L: 14/20 MS: 1 CopyPart- 00:07:59.762 #49 NEW cov: 12166 ft: 15347 corp: 32/366b lim: 20 exec/s: 49 rss: 72Mb L: 14/20 MS: 1 InsertByte- 00:07:59.762 [2024-04-26 20:03:44.007045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.762 [2024-04-26 20:03:44.007072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.762 #50 NEW cov: 12166 ft: 15357 corp: 33/383b lim: 20 exec/s: 50 rss: 72Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:59.762 [2024-04-26 20:03:44.057188] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:59.762 [2024-04-26 20:03:44.057213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.762 #51 NEW cov: 12166 ft: 15364 corp: 34/402b lim: 20 exec/s: 51 rss: 72Mb L: 19/20 MS: 1 PersAutoDict- DE: "\017\000\000\000"- 00:07:59.762 #52 NEW cov: 12166 ft: 15370 corp: 35/415b lim: 20 exec/s: 52 rss: 72Mb L: 13/20 MS: 1 PersAutoDict- DE: "\017\000\000\000"- 00:07:59.762 #53 NEW cov: 12166 ft: 15383 corp: 36/432b lim: 20 exec/s: 53 rss: 72Mb L: 17/20 MS: 1 InsertRepeatedBytes- 00:07:59.762 #54 NEW cov: 12166 ft: 15389 corp: 37/440b lim: 20 exec/s: 54 rss: 72Mb L: 8/20 MS: 1 PersAutoDict- DE: "\017\000\000\000"- 00:08:00.020 #55 NEW cov: 12166 ft: 15398 corp: 38/445b lim: 20 exec/s: 55 rss: 72Mb L: 5/20 MS: 1 ShuffleBytes- 00:08:00.020 [2024-04-26 20:03:44.257834] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:08:00.020 [2024-04-26 20:03:44.257859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.020 #56 NEW cov: 12166 ft: 15430 corp: 39/465b lim: 20 exec/s: 56 rss: 72Mb L: 20/20 MS: 1 CrossOver- 00:08:00.020 #57 NEW cov: 12166 ft: 15435 corp: 40/480b lim: 20 exec/s: 57 rss: 72Mb L: 15/20 MS: 1 InsertByte- 00:08:00.020 #58 NEW cov: 12166 ft: 15453 corp: 41/494b lim: 20 exec/s: 58 rss: 72Mb L: 14/20 MS: 1 ChangeBinInt- 00:08:00.020 #59 NEW cov: 12166 ft: 15483 corp: 42/499b lim: 20 exec/s: 29 rss: 72Mb L: 5/20 MS: 1 EraseBytes- 00:08:00.020 #59 DONE cov: 12166 ft: 15483 corp: 42/499b lim: 20 exec/s: 29 rss: 72Mb 00:08:00.020 ###### Recommended dictionary. ###### 00:08:00.020 "\017\000\000\000" # Uses: 5 00:08:00.020 ###### End of recommended dictionary. 
###### 00:08:00.020 Done 59 runs in 2 second(s) 00:08:00.280 20:03:44 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_3.conf /var/tmp/suppress_nvmf_fuzz 00:08:00.280 20:03:44 -- ../common.sh@72 -- # (( i++ )) 00:08:00.280 20:03:44 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.280 20:03:44 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:00.280 20:03:44 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:08:00.280 20:03:44 -- nvmf/run.sh@24 -- # local timen=1 00:08:00.280 20:03:44 -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.280 20:03:44 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:00.280 20:03:44 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:08:00.280 20:03:44 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:00.280 20:03:44 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:00.280 20:03:44 -- nvmf/run.sh@34 -- # printf %02d 4 00:08:00.280 20:03:44 -- nvmf/run.sh@34 -- # port=4404 00:08:00.280 20:03:44 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:00.280 20:03:44 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:08:00.280 20:03:44 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.280 20:03:44 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:00.280 20:03:44 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:00.280 20:03:44 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 00:08:00.280 [2024-04-26 20:03:44.561281] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:00.280 [2024-04-26 20:03:44.561350] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1622469 ] 00:08:00.280 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.538 [2024-04-26 20:03:44.762825] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.539 [2024-04-26 20:03:44.833909] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.539 [2024-04-26 20:03:44.893198] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.539 [2024-04-26 20:03:44.909404] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:08:00.539 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:00.539 INFO: Seed: 2763696867 00:08:00.539 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:00.539 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:00.539 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:08:00.539 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.539 #2 INITED exec/s: 0 rss: 63Mb 00:08:00.539 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:00.539 This may also happen if the target rejected all inputs we tried so far 00:08:00.539 [2024-04-26 20:03:44.958818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.539 [2024-04-26 20:03:44.958845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.539 [2024-04-26 20:03:44.958911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.539 [2024-04-26 20:03:44.958925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.539 [2024-04-26 20:03:44.958978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.539 [2024-04-26 20:03:44.958991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.539 [2024-04-26 20:03:44.959045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.539 [2024-04-26 20:03:44.959057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.084 NEW_FUNC[1/671]: 0x487e70 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:08:01.084 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:01.084 #7 NEW cov: 11686 ft: 11670 corp: 2/31b lim: 35 exec/s: 0 rss: 69Mb L: 30/30 MS: 5 CrossOver-ChangeByte-CopyPart-CopyPart-InsertRepeatedBytes- 00:08:01.084 [2024-04-26 20:03:45.289498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.289532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.289599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.289614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.289663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:01.084 [2024-04-26 20:03:45.289676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.289728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f151f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.289741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.084 #8 NEW cov: 11816 ft: 12243 corp: 3/61b lim: 35 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 ChangeBinInt- 00:08:01.084 [2024-04-26 20:03:45.339386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.339412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.339463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.339478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.339528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.339541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.084 #13 NEW cov: 11822 ft: 12826 corp: 4/84b lim: 35 exec/s: 0 rss: 69Mb L: 23/30 MS: 5 CopyPart-ChangeByte-CopyPart-ChangeBit-InsertRepeatedBytes- 00:08:01.084 [2024-04-26 20:03:45.379664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.379688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.379740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.379753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.379804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.379817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.379867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.379886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.084 #14 NEW cov: 11907 ft: 13105 corp: 5/114b lim: 35 exec/s: 0 rss: 69Mb L: 30/30 MS: 1 ChangeASCIIInt- 00:08:01.084 [2024-04-26 
20:03:45.419782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.419806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.419857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f521f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.084 [2024-04-26 20:03:45.419877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.084 [2024-04-26 20:03:45.419927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.085 [2024-04-26 20:03:45.419940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.085 [2024-04-26 20:03:45.419989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.085 [2024-04-26 20:03:45.420002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.085 #15 NEW cov: 11907 ft: 13233 corp: 6/145b lim: 35 exec/s: 0 rss: 69Mb L: 31/31 MS: 1 InsertByte- 00:08:01.085 [2024-04-26 20:03:45.459700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.085 [2024-04-26 20:03:45.459724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.085 [2024-04-26 20:03:45.459775] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00170000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.085 [2024-04-26 20:03:45.459789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.085 [2024-04-26 20:03:45.459839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.085 [2024-04-26 20:03:45.459852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.085 #16 NEW cov: 11907 ft: 13319 corp: 7/168b lim: 35 exec/s: 0 rss: 70Mb L: 23/31 MS: 1 ChangeBinInt- 00:08:01.085 [2024-04-26 20:03:45.499849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.085 [2024-04-26 20:03:45.499877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.085 [2024-04-26 20:03:45.499945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f0000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.085 [2024-04-26 20:03:45.499959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.085 [2024-04-26 
20:03:45.500012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.085 [2024-04-26 20:03:45.500025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.085 #17 NEW cov: 11907 ft: 13400 corp: 8/191b lim: 35 exec/s: 0 rss: 70Mb L: 23/31 MS: 1 ChangeByte- 00:08:01.343 [2024-04-26 20:03:45.539796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.539824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.539884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00170000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.539913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.343 #18 NEW cov: 11907 ft: 13697 corp: 9/210b lim: 35 exec/s: 0 rss: 70Mb L: 19/31 MS: 1 EraseBytes- 00:08:01.343 [2024-04-26 20:03:45.580225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.580250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.580299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f521f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.580313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.580362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.580375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.580425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2d1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.580437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.343 #19 NEW cov: 11907 ft: 13747 corp: 10/242b lim: 35 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 InsertByte- 00:08:01.343 [2024-04-26 20:03:45.620204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:efabe28f cdw11:b1de0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.620228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.620281] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f0000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.620295] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.620344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.620357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.343 #20 NEW cov: 11907 ft: 13854 corp: 11/265b lim: 35 exec/s: 0 rss: 70Mb L: 23/32 MS: 1 CMP- DE: "\342\217\357\253\261\336\012\000"- 00:08:01.343 [2024-04-26 20:03:45.660434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.660458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.660511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f521f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.660525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.660575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.660591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.343 [2024-04-26 20:03:45.660640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:2d1f1f2c cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.343 [2024-04-26 20:03:45.660653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.343 #21 NEW cov: 11907 ft: 13891 corp: 12/297b lim: 35 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 ChangeByte- 00:08:01.344 [2024-04-26 20:03:45.700419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.344 [2024-04-26 20:03:45.700443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.344 [2024-04-26 20:03:45.700494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00170000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.344 [2024-04-26 20:03:45.700508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.344 [2024-04-26 20:03:45.700558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.344 [2024-04-26 20:03:45.700571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.344 #22 NEW cov: 11907 ft: 13911 corp: 13/320b lim: 35 exec/s: 0 rss: 70Mb L: 23/32 MS: 1 ShuffleBytes- 00:08:01.344 [2024-04-26 20:03:45.740537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ 
(05) qid:0 cid:4 nsid:0 cdw10:efabe28f cdw11:b1f80000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.344 [2024-04-26 20:03:45.740561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.344 [2024-04-26 20:03:45.740612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f0000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.344 [2024-04-26 20:03:45.740625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.344 [2024-04-26 20:03:45.740677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.344 [2024-04-26 20:03:45.740690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.344 #23 NEW cov: 11907 ft: 13924 corp: 14/343b lim: 35 exec/s: 0 rss: 70Mb L: 23/32 MS: 1 ChangeByte- 00:08:01.602 [2024-04-26 20:03:45.790520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.602 [2024-04-26 20:03:45.790545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.602 [2024-04-26 20:03:45.790598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e28f0000 cdw11:efab0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.602 [2024-04-26 20:03:45.790612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.602 #24 NEW cov: 11907 ft: 13988 corp: 15/362b lim: 35 exec/s: 0 rss: 70Mb L: 19/32 MS: 1 PersAutoDict- DE: "\342\217\357\253\261\336\012\000"- 00:08:01.602 [2024-04-26 20:03:45.830917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.602 [2024-04-26 20:03:45.830942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.602 [2024-04-26 20:03:45.830994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.602 [2024-04-26 20:03:45.831011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.602 [2024-04-26 20:03:45.831062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:abb18fef cdw11:de0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.602 [2024-04-26 20:03:45.831075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.831126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.831140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.603 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.603 #25 NEW cov: 11930 ft: 14035 corp: 16/392b lim: 35 exec/s: 0 rss: 70Mb L: 30/32 MS: 1 PersAutoDict- DE: "\342\217\357\253\261\336\012\000"- 00:08:01.603 [2024-04-26 20:03:45.871034] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.871059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.871110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.871123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.871174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.871187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.871234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000151f cdw11:001f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.871247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.603 #26 NEW cov: 11930 ft: 14050 corp: 17/425b lim: 35 exec/s: 0 rss: 70Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:08:01.603 [2024-04-26 20:03:45.911025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.911049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.911101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00170000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.911115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.911165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.911178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.603 #27 NEW cov: 11930 ft: 14060 corp: 18/448b lim: 35 exec/s: 0 rss: 71Mb L: 23/33 MS: 1 ChangeBit- 00:08:01.603 [2024-04-26 20:03:45.951131] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:efabe28f cdw11:b1f80000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.951155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.951211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 
nsid:0 cdw10:00000000 cdw11:f0000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.951224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.951273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.951287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.603 #28 NEW cov: 11930 ft: 14092 corp: 19/471b lim: 35 exec/s: 28 rss: 71Mb L: 23/33 MS: 1 ChangeByte- 00:08:01.603 [2024-04-26 20:03:45.991403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.991427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.991478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.991491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.991540] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.991553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:45.991601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:45.991614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.603 #29 NEW cov: 11930 ft: 14100 corp: 20/501b lim: 35 exec/s: 29 rss: 71Mb L: 30/33 MS: 1 CrossOver- 00:08:01.603 [2024-04-26 20:03:46.031363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:efabe28f cdw11:b1de0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:46.031387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:46.031456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00e20000 cdw11:8fef0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:46.031470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.603 [2024-04-26 20:03:46.031520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0a00b1de cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.603 [2024-04-26 20:03:46.031533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.862 #30 NEW cov: 11930 ft: 14118 corp: 21/524b lim: 35 exec/s: 30 rss: 71Mb L: 23/33 MS: 1 PersAutoDict- DE: 
"\342\217\357\253\261\336\012\000"- 00:08:01.862 [2024-04-26 20:03:46.071617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.862 [2024-04-26 20:03:46.071643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.862 [2024-04-26 20:03:46.071694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e28f1f1f cdw11:efab0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.862 [2024-04-26 20:03:46.071713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.862 [2024-04-26 20:03:46.071777] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00b1de0a cdw11:de0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.862 [2024-04-26 20:03:46.071791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.862 [2024-04-26 20:03:46.071841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.862 [2024-04-26 20:03:46.071855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.862 #31 NEW cov: 11930 ft: 14134 corp: 22/554b lim: 35 exec/s: 31 rss: 71Mb L: 30/33 MS: 1 PersAutoDict- DE: "\342\217\357\253\261\336\012\000"- 00:08:01.862 [2024-04-26 20:03:46.111425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.862 [2024-04-26 20:03:46.111449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.862 [2024-04-26 20:03:46.111500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e2e20000 cdw11:8fef0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.862 [2024-04-26 20:03:46.111514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.862 #32 NEW cov: 11930 ft: 14160 corp: 23/574b lim: 35 exec/s: 32 rss: 71Mb L: 20/33 MS: 1 CopyPart- 00:08:01.862 [2024-04-26 20:03:46.151692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:e28fe28f cdw11:efab0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.862 [2024-04-26 20:03:46.151716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.862 [2024-04-26 20:03:46.151768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00e2de0a cdw11:8fef0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.151781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.151831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0a00b1de cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.151843] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.863 #33 NEW cov: 11930 ft: 14170 corp: 24/597b lim: 35 exec/s: 33 rss: 71Mb L: 23/33 MS: 1 PersAutoDict- DE: "\342\217\357\253\261\336\012\000"- 00:08:01.863 [2024-04-26 20:03:46.191948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.191972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.192023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f5f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.192037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.192086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.192099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.192149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f151f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.192164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.863 #34 NEW cov: 11930 ft: 14227 corp: 25/627b lim: 35 exec/s: 34 rss: 72Mb L: 30/33 MS: 1 ChangeBit- 00:08:01.863 [2024-04-26 20:03:46.231936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.231960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.232012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f0000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.232025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.232071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.232084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.863 #35 NEW cov: 11930 ft: 14232 corp: 26/650b lim: 35 exec/s: 35 rss: 72Mb L: 23/33 MS: 1 ShuffleBytes- 00:08:01.863 [2024-04-26 20:03:46.272142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.272166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.272233] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 
cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.272248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.272297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.272310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.863 [2024-04-26 20:03:46.272361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f001f15 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.863 [2024-04-26 20:03:46.272374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.863 #36 NEW cov: 11930 ft: 14237 corp: 27/684b lim: 35 exec/s: 36 rss: 72Mb L: 34/34 MS: 1 CopyPart- 00:08:02.122 [2024-04-26 20:03:46.312002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:04fd040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.312025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.312094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00170000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.312107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.122 #37 NEW cov: 11930 ft: 14254 corp: 28/703b lim: 35 exec/s: 37 rss: 72Mb L: 19/34 MS: 1 ChangeBinInt- 00:08:02.122 [2024-04-26 20:03:46.352112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.352136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.352187] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00170000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.352204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.122 #38 NEW cov: 11930 ft: 14271 corp: 29/718b lim: 35 exec/s: 38 rss: 72Mb L: 15/34 MS: 1 EraseBytes- 00:08:02.122 [2024-04-26 20:03:46.392514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.392538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.392588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.392601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 
20:03:46.392651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.392665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.392714] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000151f cdw11:001f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.392726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.122 #39 NEW cov: 11930 ft: 14343 corp: 30/751b lim: 35 exec/s: 39 rss: 72Mb L: 33/34 MS: 1 ChangeASCIIInt- 00:08:02.122 [2024-04-26 20:03:46.432833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.432857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.432913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.432927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.432978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.432991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.433041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.433054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.122 #40 NEW cov: 11930 ft: 14354 corp: 31/784b lim: 35 exec/s: 40 rss: 72Mb L: 33/34 MS: 1 CrossOver- 00:08:02.122 [2024-04-26 20:03:46.472899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.472925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.472977] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e2e21f1f cdw11:8fef0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.472991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.473041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:0a00b1de cdw11:de0a0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.473057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.122 
[2024-04-26 20:03:46.473109] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.473123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.122 #41 NEW cov: 11930 ft: 14360 corp: 32/814b lim: 35 exec/s: 41 rss: 72Mb L: 30/34 MS: 1 PersAutoDict- DE: "\342\217\357\253\261\336\012\000"- 00:08:02.122 [2024-04-26 20:03:46.512920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00fb0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.512946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.512998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.513012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.513062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.513075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.122 #42 NEW cov: 11930 ft: 14369 corp: 33/837b lim: 35 exec/s: 42 rss: 72Mb L: 23/34 MS: 1 ChangeBinInt- 00:08:02.122 [2024-04-26 20:03:46.552837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.552861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.122 [2024-04-26 20:03:46.552918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:e2870000 cdw11:efab0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.122 [2024-04-26 20:03:46.552932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.381 #43 NEW cov: 11930 ft: 14374 corp: 34/856b lim: 35 exec/s: 43 rss: 72Mb L: 19/34 MS: 1 ChangeBit- 00:08:02.381 [2024-04-26 20:03:46.593246] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.381 [2024-04-26 20:03:46.593270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.381 [2024-04-26 20:03:46.593322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00170000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.381 [2024-04-26 20:03:46.593336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.381 [2024-04-26 20:03:46.593384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ff0003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.381 
[2024-04-26 20:03:46.593398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.381 [2024-04-26 20:03:46.593449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.381 [2024-04-26 20:03:46.593461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.382 #44 NEW cov: 11930 ft: 14390 corp: 35/886b lim: 35 exec/s: 44 rss: 72Mb L: 30/34 MS: 1 InsertRepeatedBytes- 00:08:02.382 [2024-04-26 20:03:46.633061] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.633085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.633136] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.633149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.382 #45 NEW cov: 11930 ft: 14416 corp: 36/901b lim: 35 exec/s: 45 rss: 72Mb L: 15/34 MS: 1 EraseBytes- 00:08:02.382 [2024-04-26 20:03:46.673315] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0000e28f cdw11:00f00000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.673340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.673392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.673405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.673455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.673468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.382 #46 NEW cov: 11930 ft: 14417 corp: 37/924b lim: 35 exec/s: 46 rss: 72Mb L: 23/34 MS: 1 CopyPart- 00:08:02.382 [2024-04-26 20:03:46.713435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.713459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.713524] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f521f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.713538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.713588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.713601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.382 #47 NEW cov: 11930 ft: 14428 corp: 38/949b lim: 35 exec/s: 47 rss: 72Mb L: 25/34 MS: 1 EraseBytes- 00:08:02.382 [2024-04-26 20:03:46.753830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.753855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.753910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.753924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.753976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:1f1f1f1f cdw11:1f7e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.753988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.754041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:151f1f1f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.754054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.754107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:8 nsid:0 cdw10:1f1f1f1f cdw11:1f330000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.754120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:02.382 #48 NEW cov: 11930 ft: 14481 corp: 39/984b lim: 35 exec/s: 48 rss: 72Mb L: 35/35 MS: 1 InsertByte- 00:08:02.382 [2024-04-26 20:03:46.793838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.793864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.793920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:1f1f1f1f cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.793935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.793985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:001f0000 cdw11:1f1f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.793999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.382 [2024-04-26 20:03:46.794049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:31001f39 cdw11:001f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.382 [2024-04-26 20:03:46.794062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.382 #49 NEW cov: 11930 ft: 14490 corp: 40/1017b lim: 35 exec/s: 49 rss: 72Mb L: 33/35 MS: 1 CopyPart- 00:08:02.641 [2024-04-26 20:03:46.843795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.843819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.641 [2024-04-26 20:03:46.843889] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0400040a cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.843903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.641 [2024-04-26 20:03:46.843953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:efabe28f cdw11:b1de0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.843967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.641 #50 NEW cov: 11930 ft: 14499 corp: 41/1041b lim: 35 exec/s: 50 rss: 72Mb L: 24/35 MS: 1 CopyPart- 00:08:02.641 [2024-04-26 20:03:46.893934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0400040a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.893959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.641 [2024-04-26 20:03:46.894011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000005b cdw11:f0000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.894025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.641 [2024-04-26 20:03:46.894096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.894109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.641 #51 NEW cov: 11930 ft: 14509 corp: 42/1064b lim: 35 exec/s: 51 rss: 72Mb L: 23/35 MS: 1 ChangeByte- 00:08:02.641 [2024-04-26 20:03:46.934067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:0a0a0404 cdw11:04000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.934091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.641 [2024-04-26 20:03:46.934143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000fb00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.934156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:02.641 [2024-04-26 20:03:46.934205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.641 [2024-04-26 20:03:46.934218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.641 #52 NEW cov: 11930 ft: 14549 corp: 43/1088b lim: 35 exec/s: 26 rss: 72Mb L: 24/35 MS: 1 CrossOver- 00:08:02.641 #52 DONE cov: 11930 ft: 14549 corp: 43/1088b lim: 35 exec/s: 26 rss: 72Mb 00:08:02.641 ###### Recommended dictionary. ###### 00:08:02.641 "\342\217\357\253\261\336\012\000" # Uses: 6 00:08:02.641 ###### End of recommended dictionary. ###### 00:08:02.641 Done 52 runs in 2 second(s) 00:08:02.641 20:03:47 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_4.conf /var/tmp/suppress_nvmf_fuzz 00:08:02.900 20:03:47 -- ../common.sh@72 -- # (( i++ )) 00:08:02.900 20:03:47 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.900 20:03:47 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:02.900 20:03:47 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:08:02.900 20:03:47 -- nvmf/run.sh@24 -- # local timen=1 00:08:02.900 20:03:47 -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.900 20:03:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:02.900 20:03:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:08:02.900 20:03:47 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:02.900 20:03:47 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:02.900 20:03:47 -- nvmf/run.sh@34 -- # printf %02d 5 00:08:02.900 20:03:47 -- nvmf/run.sh@34 -- # port=4405 00:08:02.900 20:03:47 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:02.900 20:03:47 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:08:02.900 20:03:47 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.900 20:03:47 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:02.900 20:03:47 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:02.900 20:03:47 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 00:08:02.901 [2024-04-26 20:03:47.128885] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:08:02.901 [2024-04-26 20:03:47.128961] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1622831 ] 00:08:02.901 EAL: No free 2048 kB hugepages reported on node 1 00:08:03.159 [2024-04-26 20:03:47.445404] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:03.159 [2024-04-26 20:03:47.537824] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:03.159 [2024-04-26 20:03:47.596925] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:03.418 [2024-04-26 20:03:47.613101] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:08:03.418 INFO: Running with entropic power schedule (0xFF, 100). 00:08:03.418 INFO: Seed: 1174727882 00:08:03.418 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:03.418 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:03.418 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:08:03.418 INFO: A corpus is not provided, starting from an empty corpus 00:08:03.418 #2 INITED exec/s: 0 rss: 63Mb 00:08:03.418 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:03.418 This may also happen if the target rejected all inputs we tried so far 00:08:03.418 [2024-04-26 20:03:47.657870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff03ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.418 [2024-04-26 20:03:47.657909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.678 NEW_FUNC[1/671]: 0x48a000 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:08:03.678 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:03.678 #10 NEW cov: 11697 ft: 11689 corp: 2/10b lim: 45 exec/s: 0 rss: 69Mb L: 9/9 MS: 3 ChangeBinInt-CopyPart-CMP- DE: "\377\377\377\377\377\377\377>"- 00:08:03.678 [2024-04-26 20:03:47.998755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000300 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.678 [2024-04-26 20:03:47.998801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.678 #11 NEW cov: 11827 ft: 12218 corp: 3/19b lim: 45 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 ChangeBinInt- 00:08:03.678 [2024-04-26 20:03:48.068816] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a03 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.678 [2024-04-26 20:03:48.068853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.678 #12 NEW cov: 11833 ft: 12419 corp: 4/29b lim: 45 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 CrossOver- 00:08:03.937 [2024-04-26 20:03:48.128961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000300 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.937 
[2024-04-26 20:03:48.128995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.937 #13 NEW cov: 11918 ft: 12711 corp: 5/43b lim: 45 exec/s: 0 rss: 69Mb L: 14/14 MS: 1 CrossOver- 00:08:03.937 [2024-04-26 20:03:48.199154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.937 [2024-04-26 20:03:48.199187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.937 #14 NEW cov: 11918 ft: 12838 corp: 6/57b lim: 45 exec/s: 0 rss: 70Mb L: 14/14 MS: 1 ChangeByte- 00:08:03.937 [2024-04-26 20:03:48.269321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.937 [2024-04-26 20:03:48.269352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.937 #15 NEW cov: 11918 ft: 12944 corp: 7/71b lim: 45 exec/s: 0 rss: 70Mb L: 14/14 MS: 1 ChangeBit- 00:08:03.937 [2024-04-26 20:03:48.339489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff0a03 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:03.937 [2024-04-26 20:03:48.339519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.937 #19 NEW cov: 11918 ft: 13028 corp: 8/84b lim: 45 exec/s: 0 rss: 70Mb L: 13/14 MS: 4 CopyPart-CopyPart-ShuffleBytes-CrossOver- 00:08:04.195 [2024-04-26 20:03:48.389598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:03ff0a0a cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.195 [2024-04-26 20:03:48.389627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.195 #20 NEW cov: 11918 ft: 13099 corp: 9/98b lim: 45 exec/s: 0 rss: 70Mb L: 14/14 MS: 1 CrossOver- 00:08:04.195 [2024-04-26 20:03:48.459774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.195 [2024-04-26 20:03:48.459804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.195 #21 NEW cov: 11918 ft: 13250 corp: 10/115b lim: 45 exec/s: 0 rss: 70Mb L: 17/17 MS: 1 CrossOver- 00:08:04.195 [2024-04-26 20:03:48.530013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:033e0a0a cdw11:ff3e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.195 [2024-04-26 20:03:48.530043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.195 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:04.195 #24 NEW cov: 11935 ft: 13375 corp: 11/124b lim: 45 exec/s: 0 rss: 70Mb L: 9/17 MS: 3 EraseBytes-ChangeBinInt-CopyPart- 00:08:04.195 [2024-04-26 20:03:48.600195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.195 [2024-04-26 
20:03:48.600224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.195 #25 NEW cov: 11935 ft: 13425 corp: 12/134b lim: 45 exec/s: 0 rss: 70Mb L: 10/17 MS: 1 EraseBytes- 00:08:04.454 [2024-04-26 20:03:48.650322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:033e0a0a cdw11:ff3e0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.454 [2024-04-26 20:03:48.650352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.454 #26 NEW cov: 11935 ft: 13445 corp: 13/143b lim: 45 exec/s: 26 rss: 70Mb L: 9/17 MS: 1 ChangeBit- 00:08:04.454 [2024-04-26 20:03:48.720503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:030a0a0a cdw11:0a030001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.454 [2024-04-26 20:03:48.720532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.454 #28 NEW cov: 11935 ft: 13472 corp: 14/154b lim: 45 exec/s: 28 rss: 70Mb L: 11/17 MS: 2 EraseBytes-CopyPart- 00:08:04.454 [2024-04-26 20:03:48.770601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.454 [2024-04-26 20:03:48.770632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.454 #29 NEW cov: 11935 ft: 13563 corp: 15/164b lim: 45 exec/s: 29 rss: 70Mb L: 10/17 MS: 1 EraseBytes- 00:08:04.454 [2024-04-26 20:03:48.840818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:03ff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.454 [2024-04-26 20:03:48.840848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.454 [2024-04-26 20:03:48.840908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.454 [2024-04-26 20:03:48.840925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.454 #30 NEW cov: 11935 ft: 14297 corp: 16/187b lim: 45 exec/s: 30 rss: 71Mb L: 23/23 MS: 1 CrossOver- 00:08:04.712 [2024-04-26 20:03:48.901144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.712 [2024-04-26 20:03:48.901176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.712 [2024-04-26 20:03:48.901210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffff03ff cdw11:ff3a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.712 [2024-04-26 20:03:48.901228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.712 [2024-04-26 20:03:48.901257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:3a3a3a3a cdw11:3a3a0001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.712 [2024-04-26 20:03:48.901275] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.712 #31 NEW cov: 11935 ft: 14589 corp: 17/214b lim: 45 exec/s: 31 rss: 71Mb L: 27/27 MS: 1 InsertRepeatedBytes- 00:08:04.712 [2024-04-26 20:03:48.961138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.712 [2024-04-26 20:03:48.961168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.712 #32 NEW cov: 11935 ft: 14614 corp: 18/224b lim: 45 exec/s: 32 rss: 71Mb L: 10/27 MS: 1 ChangeByte- 00:08:04.712 [2024-04-26 20:03:49.031307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:030a0a0a cdw11:0a030001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.712 [2024-04-26 20:03:49.031338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.712 #33 NEW cov: 11935 ft: 14653 corp: 19/236b lim: 45 exec/s: 33 rss: 71Mb L: 12/27 MS: 1 InsertByte- 00:08:04.712 [2024-04-26 20:03:49.101535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:afaf0aaf cdw11:afaf0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.712 [2024-04-26 20:03:49.101565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.712 [2024-04-26 20:03:49.101613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:afafafaf cdw11:afaf0005 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.712 [2024-04-26 20:03:49.101629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.712 #34 NEW cov: 11935 ft: 14677 corp: 20/254b lim: 45 exec/s: 34 rss: 71Mb L: 18/27 MS: 1 InsertRepeatedBytes- 00:08:04.971 [2024-04-26 20:03:49.161600] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.971 [2024-04-26 20:03:49.161631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.971 #35 NEW cov: 11935 ft: 14700 corp: 21/264b lim: 45 exec/s: 35 rss: 71Mb L: 10/27 MS: 1 CopyPart- 00:08:04.971 [2024-04-26 20:03:49.211864] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.971 [2024-04-26 20:03:49.211903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.971 [2024-04-26 20:03:49.211937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffff03ff cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.971 [2024-04-26 20:03:49.211958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.971 #36 NEW cov: 11935 ft: 14718 corp: 22/287b lim: 45 exec/s: 36 rss: 71Mb L: 23/27 MS: 1 CopyPart- 00:08:04.971 [2024-04-26 20:03:49.261850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:ffff03ff 
cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.971 [2024-04-26 20:03:49.261887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.971 #37 NEW cov: 11935 ft: 14764 corp: 23/301b lim: 45 exec/s: 37 rss: 71Mb L: 14/27 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377>"- 00:08:04.971 [2024-04-26 20:03:49.312044] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.971 [2024-04-26 20:03:49.312076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.971 #38 NEW cov: 11935 ft: 14777 corp: 24/310b lim: 45 exec/s: 38 rss: 71Mb L: 9/27 MS: 1 EraseBytes- 00:08:04.971 [2024-04-26 20:03:49.382306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:fffff3ff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.971 [2024-04-26 20:03:49.382337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.971 [2024-04-26 20:03:49.382384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:04.971 [2024-04-26 20:03:49.382400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.230 #43 NEW cov: 11935 ft: 14791 corp: 25/329b lim: 45 exec/s: 43 rss: 72Mb L: 19/27 MS: 5 ChangeByte-ChangeBit-ShuffleBytes-CopyPart-InsertRepeatedBytes- 00:08:05.230 [2024-04-26 20:03:49.432502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000300 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.230 [2024-04-26 20:03:49.432532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.230 #44 NEW cov: 11935 ft: 14795 corp: 26/343b lim: 45 exec/s: 44 rss: 72Mb L: 14/27 MS: 1 ChangeByte- 00:08:05.230 [2024-04-26 20:03:49.482614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00240300 cdw11:ff090007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.230 [2024-04-26 20:03:49.482643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.230 #45 NEW cov: 11935 ft: 14804 corp: 27/352b lim: 45 exec/s: 45 rss: 72Mb L: 9/27 MS: 1 ShuffleBytes- 00:08:05.230 [2024-04-26 20:03:49.552817] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000324 cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.230 [2024-04-26 20:03:49.552846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.230 #46 NEW cov: 11942 ft: 14821 corp: 28/362b lim: 45 exec/s: 46 rss: 72Mb L: 10/27 MS: 1 CopyPart- 00:08:05.230 [2024-04-26 20:03:49.602943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000300 cdw11:09ff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:05.230 [2024-04-26 20:03:49.602972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:05.230 #47 NEW cov: 11942 ft: 14857 corp: 29/376b lim: 45 exec/s: 23 rss: 72Mb L: 14/27 MS: 1 ShuffleBytes- 00:08:05.230 #47 DONE cov: 11942 ft: 14857 corp: 29/376b lim: 45 exec/s: 23 rss: 72Mb 00:08:05.230 ###### Recommended dictionary. ###### 00:08:05.230 "\377\377\377\377\377\377\377>" # Uses: 1 00:08:05.230 ###### End of recommended dictionary. ###### 00:08:05.230 Done 47 runs in 2 second(s) 00:08:05.489 20:03:49 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_5.conf /var/tmp/suppress_nvmf_fuzz 00:08:05.489 20:03:49 -- ../common.sh@72 -- # (( i++ )) 00:08:05.489 20:03:49 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:05.489 20:03:49 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:08:05.489 20:03:49 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:08:05.489 20:03:49 -- nvmf/run.sh@24 -- # local timen=1 00:08:05.489 20:03:49 -- nvmf/run.sh@25 -- # local core=0x1 00:08:05.489 20:03:49 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:05.489 20:03:49 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:08:05.489 20:03:49 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:05.489 20:03:49 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:05.489 20:03:49 -- nvmf/run.sh@34 -- # printf %02d 6 00:08:05.489 20:03:49 -- nvmf/run.sh@34 -- # port=4406 00:08:05.489 20:03:49 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:05.489 20:03:49 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:08:05.489 20:03:49 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:05.489 20:03:49 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:05.489 20:03:49 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:05.490 20:03:49 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 00:08:05.490 [2024-04-26 20:03:49.834398] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:05.490 [2024-04-26 20:03:49.834480] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623184 ] 00:08:05.490 EAL: No free 2048 kB hugepages reported on node 1 00:08:05.748 [2024-04-26 20:03:50.146824] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:06.007 [2024-04-26 20:03:50.228975] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:06.007 [2024-04-26 20:03:50.288634] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:06.007 [2024-04-26 20:03:50.304840] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:08:06.007 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:06.007 INFO: Seed: 3865754878 00:08:06.007 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:06.007 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:06.007 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:08:06.007 INFO: A corpus is not provided, starting from an empty corpus 00:08:06.007 #2 INITED exec/s: 0 rss: 62Mb 00:08:06.007 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:06.007 This may also happen if the target rejected all inputs we tried so far 00:08:06.007 [2024-04-26 20:03:50.360359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:06.007 [2024-04-26 20:03:50.360389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.007 [2024-04-26 20:03:50.360440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:06.007 [2024-04-26 20:03:50.360454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.007 [2024-04-26 20:03:50.360503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d80a cdw11:00000000 00:08:06.007 [2024-04-26 20:03:50.360520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.265 NEW_FUNC[1/668]: 0x48c810 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:08:06.265 NEW_FUNC[2/668]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:06.265 #3 NEW cov: 11610 ft: 11611 corp: 2/7b lim: 10 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 InsertRepeatedBytes- 00:08:06.265 [2024-04-26 20:03:50.671210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:06.265 [2024-04-26 20:03:50.671259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.265 [2024-04-26 20:03:50.671331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:06.265 [2024-04-26 20:03:50.671350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.265 [2024-04-26 20:03:50.671411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df0a cdw11:00000000 00:08:06.265 [2024-04-26 20:03:50.671429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.265 NEW_FUNC[1/1]: 0x1794940 in nvme_tcp_qpair /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_tcp.c:171 00:08:06.265 #4 NEW cov: 11744 ft: 12289 corp: 3/13b lim: 10 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:06.524 [2024-04-26 20:03:50.721004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000422e cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.721042] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.524 #7 NEW cov: 11750 ft: 12722 corp: 4/15b lim: 10 exec/s: 0 rss: 70Mb L: 2/6 MS: 3 ChangeBit-ChangeBit-InsertByte- 00:08:06.524 [2024-04-26 20:03:50.761071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a2e cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.761095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.524 #8 NEW cov: 11835 ft: 13067 corp: 5/17b lim: 10 exec/s: 0 rss: 70Mb L: 2/6 MS: 1 ChangeBit- 00:08:06.524 [2024-04-26 20:03:50.801215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a8e cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.801239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.524 #9 NEW cov: 11835 ft: 13105 corp: 6/19b lim: 10 exec/s: 0 rss: 70Mb L: 2/6 MS: 1 InsertByte- 00:08:06.524 [2024-04-26 20:03:50.841309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000422c cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.841334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.524 #10 NEW cov: 11835 ft: 13223 corp: 7/21b lim: 10 exec/s: 0 rss: 70Mb L: 2/6 MS: 1 ChangeBit- 00:08:06.524 [2024-04-26 20:03:50.881849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.881880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.524 [2024-04-26 20:03:50.881943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.881962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.524 [2024-04-26 20:03:50.882025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.882047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.524 [2024-04-26 20:03:50.882105] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.882119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.524 #12 NEW cov: 11835 ft: 13502 corp: 8/30b lim: 10 exec/s: 0 rss: 70Mb L: 9/9 MS: 2 EraseBytes-CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:06.524 [2024-04-26 20:03:50.921547] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000042d8 cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.921572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.524 #13 NEW cov: 11835 ft: 13626 corp: 9/33b lim: 10 exec/s: 0 rss: 70Mb L: 3/9 MS: 1 CrossOver- 00:08:06.524 [2024-04-26 20:03:50.961708] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a2e cdw11:00000000 00:08:06.524 [2024-04-26 20:03:50.961734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.783 #14 NEW cov: 11835 ft: 13689 corp: 10/35b lim: 10 exec/s: 0 rss: 70Mb L: 2/9 MS: 1 ShuffleBytes- 00:08:06.783 [2024-04-26 20:03:51.001792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000462c cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.001816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.783 #15 NEW cov: 11835 ft: 13794 corp: 11/37b lim: 10 exec/s: 0 rss: 70Mb L: 2/9 MS: 1 ChangeBit- 00:08:06.783 [2024-04-26 20:03:51.041901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.041925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.783 #22 NEW cov: 11835 ft: 13808 corp: 12/39b lim: 10 exec/s: 0 rss: 70Mb L: 2/9 MS: 2 EraseBytes-CrossOver- 00:08:06.783 [2024-04-26 20:03:51.082078] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a8e cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.082103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.783 #23 NEW cov: 11835 ft: 13859 corp: 13/42b lim: 10 exec/s: 0 rss: 70Mb L: 3/9 MS: 1 InsertByte- 00:08:06.783 [2024-04-26 20:03:51.122399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.122423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.783 [2024-04-26 20:03:51.122491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d8d8 cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.122504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.783 [2024-04-26 20:03:51.122557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000df0a cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.122570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.783 #24 NEW cov: 11835 ft: 13876 corp: 14/48b lim: 10 exec/s: 0 rss: 70Mb L: 6/9 MS: 1 ShuffleBytes- 00:08:06.783 [2024-04-26 20:03:51.162265] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.162290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.783 #25 NEW cov: 11835 ft: 13883 corp: 15/51b lim: 10 exec/s: 0 rss: 70Mb L: 3/9 MS: 1 CrossOver- 00:08:06.783 [2024-04-26 20:03:51.202748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.202771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:06.783 [2024-04-26 20:03:51.202840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.202853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:06.783 [2024-04-26 20:03:51.202926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.202941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:06.783 [2024-04-26 20:03:51.202993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:06.783 [2024-04-26 20:03:51.203007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:06.783 #26 NEW cov: 11835 ft: 13887 corp: 16/60b lim: 10 exec/s: 0 rss: 70Mb L: 9/9 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:07.043 [2024-04-26 20:03:51.242964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.242988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.243056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000deb4 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.243069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.243122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000abf6 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.243135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.243189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00008fe6 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.243202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.243254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.243267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.043 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:07.043 #27 NEW cov: 11858 ft: 13969 corp: 17/70b lim: 10 exec/s: 0 rss: 70Mb L: 10/10 MS: 1 CMP- DE: "\000\012\336\264\253\366\217\346"- 00:08:07.043 [2024-04-26 20:03:51.282633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000042d8 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.282658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.043 #28 NEW cov: 11858 ft: 13989 corp: 18/73b lim: 10 exec/s: 0 rss: 71Mb L: 
3/10 MS: 1 CrossOver- 00:08:07.043 [2024-04-26 20:03:51.323010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.323034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.323088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.323104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.323156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0a cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.323169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.043 #29 NEW cov: 11858 ft: 13991 corp: 19/79b lim: 10 exec/s: 29 rss: 71Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:08:07.043 [2024-04-26 20:03:51.353381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.353406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.353459] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000deb4 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.353473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.353523] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000abf6 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.353537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.353591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00008fe6 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.353605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.353657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000462c cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.353670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.043 #30 NEW cov: 11858 ft: 14064 corp: 20/89b lim: 10 exec/s: 30 rss: 71Mb L: 10/10 MS: 1 PersAutoDict- DE: "\000\012\336\264\253\366\217\346"- 00:08:07.043 [2024-04-26 20:03:51.393296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a2e cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.393320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.393389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.393403] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.393454] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002222 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.393466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.393517] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002222 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.393530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.043 #31 NEW cov: 11858 ft: 14094 corp: 21/98b lim: 10 exec/s: 31 rss: 71Mb L: 9/10 MS: 1 InsertRepeatedBytes- 00:08:07.043 [2024-04-26 20:03:51.433081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000043d8 cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.433105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.043 #32 NEW cov: 11858 ft: 14113 corp: 22/101b lim: 10 exec/s: 32 rss: 71Mb L: 3/10 MS: 1 ChangeBit- 00:08:07.043 [2024-04-26 20:03:51.473368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a8e cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.473395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.043 [2024-04-26 20:03:51.473447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a8e cdw11:00000000 00:08:07.043 [2024-04-26 20:03:51.473461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.302 #33 NEW cov: 11858 ft: 14352 corp: 23/105b lim: 10 exec/s: 33 rss: 71Mb L: 4/10 MS: 1 CopyPart- 00:08:07.302 [2024-04-26 20:03:51.513453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e2e cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.513478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.302 [2024-04-26 20:03:51.513531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002e2c cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.513545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.302 #34 NEW cov: 11858 ft: 14373 corp: 24/109b lim: 10 exec/s: 34 rss: 71Mb L: 4/10 MS: 1 CopyPart- 00:08:07.302 [2024-04-26 20:03:51.553430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002e2c cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.553454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.302 #35 NEW cov: 11858 ft: 14402 corp: 25/111b lim: 10 exec/s: 35 rss: 71Mb L: 2/10 MS: 1 EraseBytes- 00:08:07.302 [2024-04-26 20:03:51.593943] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a2e cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.593968] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.302 [2024-04-26 20:03:51.594024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.594038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.302 [2024-04-26 20:03:51.594090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002242 cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.594104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.302 [2024-04-26 20:03:51.594156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002222 cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.594169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.302 #36 NEW cov: 11858 ft: 14448 corp: 26/120b lim: 10 exec/s: 36 rss: 72Mb L: 9/10 MS: 1 ChangeByte- 00:08:07.302 [2024-04-26 20:03:51.633918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d2d2 cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.633942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.302 [2024-04-26 20:03:51.634008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d2d2 cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.634021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.302 [2024-04-26 20:03:51.634073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d22e cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.634086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.302 #37 NEW cov: 11858 ft: 14452 corp: 27/127b lim: 10 exec/s: 37 rss: 72Mb L: 7/10 MS: 1 InsertRepeatedBytes- 00:08:07.302 [2024-04-26 20:03:51.674191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000000a cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.674215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.302 [2024-04-26 20:03:51.674269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000deb4 cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.674282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.302 [2024-04-26 20:03:51.674335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000abf6 cdw11:00000000 00:08:07.302 [2024-04-26 20:03:51.674349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.303 [2024-04-26 20:03:51.674403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00008fe6 cdw11:00000000 00:08:07.303 [2024-04-26 20:03:51.674416] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.303 #38 NEW cov: 11858 ft: 14457 corp: 28/136b lim: 10 exec/s: 38 rss: 72Mb L: 9/10 MS: 1 PersAutoDict- DE: "\000\012\336\264\253\366\217\346"- 00:08:07.303 [2024-04-26 20:03:51.703942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004ab7 cdw11:00000000 00:08:07.303 [2024-04-26 20:03:51.703966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.303 #39 NEW cov: 11858 ft: 14464 corp: 29/138b lim: 10 exec/s: 39 rss: 72Mb L: 2/10 MS: 1 ChangeByte- 00:08:07.303 [2024-04-26 20:03:51.744030] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000422e cdw11:00000000 00:08:07.303 [2024-04-26 20:03:51.744055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.561 #40 NEW cov: 11858 ft: 14508 corp: 30/140b lim: 10 exec/s: 40 rss: 72Mb L: 2/10 MS: 1 ShuffleBytes- 00:08:07.561 [2024-04-26 20:03:51.784301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ab6 cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.784326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.561 [2024-04-26 20:03:51.784382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a8e cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.784395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.561 #41 NEW cov: 11858 ft: 14521 corp: 31/144b lim: 10 exec/s: 41 rss: 72Mb L: 4/10 MS: 1 ChangeByte- 00:08:07.561 [2024-04-26 20:03:51.824251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000042dc cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.824274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.561 #42 NEW cov: 11858 ft: 14529 corp: 32/146b lim: 10 exec/s: 42 rss: 72Mb L: 2/10 MS: 1 ChangeByte- 00:08:07.561 [2024-04-26 20:03:51.864348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a2e cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.864374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.561 #43 NEW cov: 11858 ft: 14544 corp: 33/148b lim: 10 exec/s: 43 rss: 72Mb L: 2/10 MS: 1 CrossOver- 00:08:07.561 [2024-04-26 20:03:51.904836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.904863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.561 [2024-04-26 20:03:51.904944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009800 cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.904967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.561 [2024-04-26 20:03:51.905021] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.905034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.561 [2024-04-26 20:03:51.905087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.905100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.561 #44 NEW cov: 11858 ft: 14581 corp: 34/157b lim: 10 exec/s: 44 rss: 72Mb L: 9/10 MS: 1 ChangeByte- 00:08:07.561 [2024-04-26 20:03:51.944658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000002c cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.944682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.561 #45 NEW cov: 11858 ft: 14650 corp: 35/159b lim: 10 exec/s: 45 rss: 72Mb L: 2/10 MS: 1 CrossOver- 00:08:07.561 [2024-04-26 20:03:51.984787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002c0a cdw11:00000000 00:08:07.561 [2024-04-26 20:03:51.984814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.820 #46 NEW cov: 11858 ft: 14653 corp: 36/161b lim: 10 exec/s: 46 rss: 72Mb L: 2/10 MS: 1 CrossOver- 00:08:07.820 [2024-04-26 20:03:52.024849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002c00 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.024880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.820 #47 NEW cov: 11858 ft: 14660 corp: 37/164b lim: 10 exec/s: 47 rss: 72Mb L: 3/10 MS: 1 InsertByte- 00:08:07.820 [2024-04-26 20:03:52.065457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.065482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.065534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009800 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.065548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.065599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.065612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.065663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002d00 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.065675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.065726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 
00:08:07.820 [2024-04-26 20:03:52.065739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:07.820 #48 NEW cov: 11858 ft: 14682 corp: 38/174b lim: 10 exec/s: 48 rss: 72Mb L: 10/10 MS: 1 InsertByte- 00:08:07.820 [2024-04-26 20:03:52.105313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004aff cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.105338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.105408] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.105422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.105476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff2e cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.105489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.820 #49 NEW cov: 11858 ft: 14699 corp: 39/180b lim: 10 exec/s: 49 rss: 72Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:08:07.820 [2024-04-26 20:03:52.145290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a12 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.145315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.145384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a8e cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.145398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.820 #50 NEW cov: 11858 ft: 14702 corp: 40/184b lim: 10 exec/s: 50 rss: 72Mb L: 4/10 MS: 1 ChangeByte- 00:08:07.820 [2024-04-26 20:03:52.185301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004c42 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.185325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.820 #51 NEW cov: 11858 ft: 14708 corp: 41/187b lim: 10 exec/s: 51 rss: 72Mb L: 3/10 MS: 1 InsertByte- 00:08:07.820 [2024-04-26 20:03:52.215497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.215521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.215591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b68e cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.215604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.820 #52 NEW cov: 11858 ft: 14760 corp: 42/191b lim: 10 exec/s: 52 rss: 72Mb L: 4/10 MS: 1 ShuffleBytes- 00:08:07.820 [2024-04-26 20:03:52.255841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004222 
cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.255865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.255940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d2d2 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.255955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.256007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000d2d2 cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.256021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:07.820 [2024-04-26 20:03:52.256074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000d22e cdw11:00000000 00:08:07.820 [2024-04-26 20:03:52.256087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.080 #53 NEW cov: 11858 ft: 14772 corp: 43/200b lim: 10 exec/s: 53 rss: 72Mb L: 9/10 MS: 1 CrossOver- 00:08:08.080 [2024-04-26 20:03:52.295975] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004a2e cdw11:00000000 00:08:08.080 [2024-04-26 20:03:52.296006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.080 [2024-04-26 20:03:52.296059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002222 cdw11:00000000 00:08:08.080 [2024-04-26 20:03:52.296071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.080 [2024-04-26 20:03:52.296140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00002243 cdw11:00000000 00:08:08.080 [2024-04-26 20:03:52.296154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.080 [2024-04-26 20:03:52.296208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00002222 cdw11:00000000 00:08:08.080 [2024-04-26 20:03:52.296221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:08.080 #54 NEW cov: 11858 ft: 14775 corp: 44/209b lim: 10 exec/s: 54 rss: 73Mb L: 9/10 MS: 1 ChangeBit- 00:08:08.080 [2024-04-26 20:03:52.335937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:08:08.080 [2024-04-26 20:03:52.335961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.080 [2024-04-26 20:03:52.336029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000f5ff cdw11:00000000 00:08:08.080 [2024-04-26 20:03:52.336042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.080 [2024-04-26 20:03:52.336094] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ff0a 
cdw11:00000000 00:08:08.080 [2024-04-26 20:03:52.336107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.080 #55 NEW cov: 11858 ft: 14783 corp: 45/215b lim: 10 exec/s: 27 rss: 73Mb L: 6/10 MS: 1 ChangeBinInt- 00:08:08.080 #55 DONE cov: 11858 ft: 14783 corp: 45/215b lim: 10 exec/s: 27 rss: 73Mb 00:08:08.080 ###### Recommended dictionary. ###### 00:08:08.080 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:08.080 "\000\012\336\264\253\366\217\346" # Uses: 2 00:08:08.080 ###### End of recommended dictionary. ###### 00:08:08.080 Done 55 runs in 2 second(s) 00:08:08.080 20:03:52 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_6.conf /var/tmp/suppress_nvmf_fuzz 00:08:08.080 20:03:52 -- ../common.sh@72 -- # (( i++ )) 00:08:08.080 20:03:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:08.080 20:03:52 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:08:08.080 20:03:52 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:08:08.080 20:03:52 -- nvmf/run.sh@24 -- # local timen=1 00:08:08.080 20:03:52 -- nvmf/run.sh@25 -- # local core=0x1 00:08:08.080 20:03:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.080 20:03:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:08:08.080 20:03:52 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:08.080 20:03:52 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:08.080 20:03:52 -- nvmf/run.sh@34 -- # printf %02d 7 00:08:08.080 20:03:52 -- nvmf/run.sh@34 -- # port=4407 00:08:08.080 20:03:52 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.080 20:03:52 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:08:08.080 20:03:52 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:08.080 20:03:52 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:08.080 20:03:52 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:08.080 20:03:52 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 00:08:08.339 [2024-04-26 20:03:52.545690] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
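[editor's note] The nvmf/run.sh trace above shows how each start_llvm_fuzz iteration is prepared before the SPDK/DPDK target initializes: a per-run TCP port and transport ID are built, the JSON target config is rewritten to that port, LSAN leak suppressions are written out, and llvm_nvme_fuzz is launched with -Z selecting the fuzz target. The shell sketch below is a rough reconstruction of those visible steps, not lines from the script itself: the SPDK_DIR/OUT variables, the output redirections (which the xtrace does not show), and the "44 + two-digit index" port derivation are assumptions inferred from the logged commands.

  # rough reconstruction of one start_llvm_fuzz iteration (illustrative only)
  fuzzer_type=7                                   # -Z target index; 8 on the next iteration
  core=0x1                                        # reactor core mask passed via -m
  port="44$(printf %02d "$fuzzer_type")"          # assumed derivation: 4407, 4408, ...
  corpus_dir="$SPDK_DIR/../corpus/llvm_nvmf_${fuzzer_type}"
  nvmf_cfg="/tmp/fuzz_json_${fuzzer_type}.conf"
  suppress_file=/var/tmp/suppress_nvmf_fuzz

  mkdir -p "$corpus_dir"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:${port}"

  # retarget the JSON config from the default 4420 listener to this run's port
  # (redirection into $nvmf_cfg is assumed; the trace only shows the sed expression)
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"${port}\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

  # known-benign leaks are suppressed so LSAN does not fail the short run
  echo leak:spdk_nvmf_qpair_disconnect  > "$suppress_file"
  echo leak:nvmf_ctrlr_create          >> "$suppress_file"
  export LSAN_OPTIONS="report_objects=1:suppressions=${suppress_file}:print_suppressions=0"

  # one-second (-t 1) run of fuzz target $fuzzer_type against the TCP listener above
  "$SPDK_DIR/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
      -P "$OUT/llvm/" -F "$trid" -c "$nvmf_cfg" -t 1 -D "$corpus_dir" -Z "$fuzzer_type"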
00:08:08.339 [2024-04-26 20:03:52.545770] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623546 ] 00:08:08.339 EAL: No free 2048 kB hugepages reported on node 1 00:08:08.597 [2024-04-26 20:03:52.869433] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:08.598 [2024-04-26 20:03:52.962701] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.598 [2024-04-26 20:03:53.022148] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.598 [2024-04-26 20:03:53.038370] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:08:08.856 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.856 INFO: Seed: 2304775934 00:08:08.856 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:08.856 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:08.856 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:08:08.856 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.856 #2 INITED exec/s: 0 rss: 63Mb 00:08:08.856 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:08.856 This may also happen if the target rejected all inputs we tried so far 00:08:08.856 [2024-04-26 20:03:53.116133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:08.856 [2024-04-26 20:03:53.116177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:08.856 [2024-04-26 20:03:53.116297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.856 [2024-04-26 20:03:53.116317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:08.856 [2024-04-26 20:03:53.116431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.856 [2024-04-26 20:03:53.116449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:08.856 [2024-04-26 20:03:53.116552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:08.856 [2024-04-26 20:03:53.116571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.121 NEW_FUNC[1/669]: 0x48d200 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:08:09.121 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:09.121 #3 NEW cov: 11614 ft: 11615 corp: 2/10b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:09.121 [2024-04-26 20:03:53.476963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a01 cdw11:00000000 00:08:09.121 [2024-04-26 20:03:53.477006] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.121 [2024-04-26 20:03:53.477097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.121 [2024-04-26 20:03:53.477114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.121 [2024-04-26 20:03:53.477204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.121 [2024-04-26 20:03:53.477225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.121 [2024-04-26 20:03:53.477310] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.121 [2024-04-26 20:03:53.477325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.121 #5 NEW cov: 11744 ft: 12138 corp: 3/19b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 2 ChangeBit-PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:09.121 [2024-04-26 20:03:53.527015] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a2a cdw11:00000000 00:08:09.121 [2024-04-26 20:03:53.527045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.121 [2024-04-26 20:03:53.527134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000062f6 cdw11:00000000 00:08:09.121 [2024-04-26 20:03:53.527151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.121 [2024-04-26 20:03:53.527236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000022b6 cdw11:00000000 00:08:09.121 [2024-04-26 20:03:53.527249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.121 [2024-04-26 20:03:53.527334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000de0a cdw11:00000000 00:08:09.121 [2024-04-26 20:03:53.527349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.121 #6 NEW cov: 11750 ft: 12330 corp: 4/28b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 CMP- DE: "*b\366\"\266\336\012\000"- 00:08:09.379 [2024-04-26 20:03:53.586452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ef0a cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.586480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.379 #8 NEW cov: 11835 ft: 12860 corp: 5/30b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 2 ChangeByte-CrossOver- 00:08:09.379 [2024-04-26 20:03:53.637377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.637403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.379 [2024-04-26 20:03:53.637490] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.637506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.379 [2024-04-26 20:03:53.637589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.637605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.379 [2024-04-26 20:03:53.637692] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.637708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.379 #9 NEW cov: 11835 ft: 13035 corp: 6/39b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 ChangeBit- 00:08:09.379 [2024-04-26 20:03:53.697213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.697238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.379 [2024-04-26 20:03:53.697323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.697342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.379 #10 NEW cov: 11835 ft: 13311 corp: 7/44b lim: 10 exec/s: 0 rss: 69Mb L: 5/9 MS: 1 EraseBytes- 00:08:09.379 [2024-04-26 20:03:53.757852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.757880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.379 [2024-04-26 20:03:53.757967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.757982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.379 [2024-04-26 20:03:53.758079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.758095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.379 [2024-04-26 20:03:53.758178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.758193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.379 #13 NEW cov: 11835 ft: 13415 corp: 8/53b lim: 10 exec/s: 0 rss: 70Mb L: 9/9 MS: 3 EraseBytes-CrossOver-PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:09.379 [2024-04-26 20:03:53.817664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.817689] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.379 [2024-04-26 20:03:53.817781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 00:08:09.379 [2024-04-26 20:03:53.817797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.639 #14 NEW cov: 11835 ft: 13457 corp: 9/57b lim: 10 exec/s: 0 rss: 70Mb L: 4/9 MS: 1 EraseBytes- 00:08:09.639 [2024-04-26 20:03:53.878491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.878516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:53.878598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.878614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:53.878687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.878702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:53.878786] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.878802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.639 #15 NEW cov: 11835 ft: 13505 corp: 10/66b lim: 10 exec/s: 0 rss: 70Mb L: 9/9 MS: 1 ShuffleBytes- 00:08:09.639 [2024-04-26 20:03:53.928683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.928706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:53.928789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.928805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:53.928886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.928913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:53.928998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.929013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.639 #16 NEW cov: 11835 ft: 13528 corp: 11/75b lim: 10 exec/s: 0 rss: 70Mb L: 9/9 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\006"- 00:08:09.639 [2024-04-26 20:03:53.988082] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ef0a cdw11:00000000 00:08:09.639 [2024-04-26 20:03:53.988107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.639 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:09.639 #17 NEW cov: 11858 ft: 13676 corp: 12/77b lim: 10 exec/s: 0 rss: 70Mb L: 2/9 MS: 1 CopyPart- 00:08:09.639 [2024-04-26 20:03:54.039350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000003a cdw11:00000000 00:08:09.639 [2024-04-26 20:03:54.039376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:54.039460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:54.039476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:54.039562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:54.039577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:54.039642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.639 [2024-04-26 20:03:54.039657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.639 [2024-04-26 20:03:54.039715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:08:09.639 [2024-04-26 20:03:54.039732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:09.639 #18 NEW cov: 11858 ft: 13746 corp: 13/87b lim: 10 exec/s: 0 rss: 70Mb L: 10/10 MS: 1 InsertByte- 00:08:09.968 [2024-04-26 20:03:54.099596] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.099622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.099703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.099720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.099805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.099820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.099904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000600 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.099922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.968 #19 NEW cov: 11858 ft: 13773 corp: 14/95b lim: 10 exec/s: 19 rss: 70Mb L: 8/10 MS: 1 EraseBytes- 00:08:09.968 [2024-04-26 20:03:54.159723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.159747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.159825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.159841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.159938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.159953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.160041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.160056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.968 #20 NEW cov: 11858 ft: 13806 corp: 15/104b lim: 10 exec/s: 20 rss: 70Mb L: 9/10 MS: 1 ChangeBit- 00:08:09.968 [2024-04-26 20:03:54.209976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.210000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.210080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00001a01 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.210095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.210174] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.210189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.210268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.210285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.968 #21 NEW cov: 11858 ft: 13835 corp: 16/113b lim: 10 exec/s: 21 rss: 70Mb L: 9/10 MS: 1 ShuffleBytes- 00:08:09.968 [2024-04-26 20:03:54.260275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.260300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.260385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 
cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.260400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.260482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.260498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.260578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.260596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.968 #22 NEW cov: 11858 ft: 13852 corp: 17/121b lim: 10 exec/s: 22 rss: 70Mb L: 8/10 MS: 1 ChangeByte- 00:08:09.968 [2024-04-26 20:03:54.320449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a2a cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.320472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.320557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:000062f6 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.320573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.320657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:000022b6 cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.320674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:09.968 [2024-04-26 20:03:54.320753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000de0a cdw11:00000000 00:08:09.968 [2024-04-26 20:03:54.320769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:09.968 #23 NEW cov: 11858 ft: 13935 corp: 18/130b lim: 10 exec/s: 23 rss: 71Mb L: 9/10 MS: 1 ChangeBit- 00:08:10.227 [2024-04-26 20:03:54.380364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000400 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.380390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.380477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.380493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.227 #24 NEW cov: 11858 ft: 13983 corp: 19/134b lim: 10 exec/s: 24 rss: 71Mb L: 4/10 MS: 1 ChangeBinInt- 00:08:10.227 [2024-04-26 20:03:54.440228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000efef cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.440256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.227 #25 NEW cov: 11858 ft: 14013 
corp: 20/137b lim: 10 exec/s: 25 rss: 71Mb L: 3/10 MS: 1 CopyPart- 00:08:10.227 [2024-04-26 20:03:54.490984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a9a9 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.491010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.491100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a9a9 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.491118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.491210] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ef0a cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.491226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.227 #26 NEW cov: 11858 ft: 14160 corp: 21/143b lim: 10 exec/s: 26 rss: 71Mb L: 6/10 MS: 1 InsertRepeatedBytes- 00:08:10.227 [2024-04-26 20:03:54.551377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dda9 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.551403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.551488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a9a9 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.551504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.551587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000a9ef cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.551605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.227 #27 NEW cov: 11858 ft: 14173 corp: 22/150b lim: 10 exec/s: 27 rss: 71Mb L: 7/10 MS: 1 InsertByte- 00:08:10.227 [2024-04-26 20:03:54.612320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.612348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.612435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.612451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.612534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.612550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.612620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000028 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.612635] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.612703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.612718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.227 #28 NEW cov: 11858 ft: 14185 corp: 23/160b lim: 10 exec/s: 28 rss: 71Mb L: 10/10 MS: 1 InsertByte- 00:08:10.227 [2024-04-26 20:03:54.662303] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.662330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.662422] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.662440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.662526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.662541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.227 [2024-04-26 20:03:54.662624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000107 cdw11:00000000 00:08:10.227 [2024-04-26 20:03:54.662641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.485 #29 NEW cov: 11858 ft: 14192 corp: 24/169b lim: 10 exec/s: 29 rss: 71Mb L: 9/10 MS: 1 ChangeBinInt- 00:08:10.485 [2024-04-26 20:03:54.712282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a9a9 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.712309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.712397] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a9a9 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.712416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.712489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ef25 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.712503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.485 #30 NEW cov: 11858 ft: 14215 corp: 25/176b lim: 10 exec/s: 30 rss: 71Mb L: 7/10 MS: 1 InsertByte- 00:08:10.485 [2024-04-26 20:03:54.762784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001a01 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.762810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.762921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 
cdw10:00000000 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.762939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.763019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00004000 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.763037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.763123] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.763141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.485 #31 NEW cov: 11858 ft: 14224 corp: 26/185b lim: 10 exec/s: 31 rss: 71Mb L: 9/10 MS: 1 ChangeBit- 00:08:10.485 [2024-04-26 20:03:54.812679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000dda9 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.812707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.812787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a9ef cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.812804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.485 #32 NEW cov: 11858 ft: 14303 corp: 27/190b lim: 10 exec/s: 32 rss: 72Mb L: 5/10 MS: 1 EraseBytes- 00:08:10.485 [2024-04-26 20:03:54.872936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a9a9 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.872962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.873052] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000a9a9 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.873067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.485 #35 NEW cov: 11858 ft: 14326 corp: 28/195b lim: 10 exec/s: 35 rss: 72Mb L: 5/10 MS: 3 CrossOver-ShuffleBytes-CrossOver- 00:08:10.485 [2024-04-26 20:03:54.923905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.923929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.924017] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.924033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.924122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.924138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:10.485 [2024-04-26 20:03:54.924225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.924242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.485 [2024-04-26 20:03:54.924338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.485 [2024-04-26 20:03:54.924354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.744 #36 NEW cov: 11858 ft: 14341 corp: 29/205b lim: 10 exec/s: 36 rss: 72Mb L: 10/10 MS: 1 CopyPart- 00:08:10.744 [2024-04-26 20:03:54.984018] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:54.984042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:54.984122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:54.984139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:54.984224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:54.984240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:54.984319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000600 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:54.984335] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.744 #37 NEW cov: 11858 ft: 14350 corp: 30/214b lim: 10 exec/s: 37 rss: 72Mb L: 9/10 MS: 1 CopyPart- 00:08:10.744 [2024-04-26 20:03:55.034278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.034309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:55.034399] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.034415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:55.034494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.034511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:55.034595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.034611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 
00:08:10.744 #38 NEW cov: 11858 ft: 14378 corp: 31/223b lim: 10 exec/s: 38 rss: 72Mb L: 9/10 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\006"- 00:08:10.744 [2024-04-26 20:03:55.094824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.094849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:55.094940] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000400 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.094961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:55.095056] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.095071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:55.095157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.095173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:10.744 [2024-04-26 20:03:55.095256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000000a cdw11:00000000 00:08:10.744 [2024-04-26 20:03:55.095272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:10.744 #39 NEW cov: 11858 ft: 14385 corp: 32/233b lim: 10 exec/s: 19 rss: 72Mb L: 10/10 MS: 1 ChangeBinInt- 00:08:10.744 #39 DONE cov: 11858 ft: 14385 corp: 32/233b lim: 10 exec/s: 19 rss: 72Mb 00:08:10.744 ###### Recommended dictionary. ###### 00:08:10.744 "\001\000\000\000\000\000\000\000" # Uses: 2 00:08:10.744 "*b\366\"\266\336\012\000" # Uses: 0 00:08:10.744 "\000\000\000\000\000\000\000\006" # Uses: 1 00:08:10.744 ###### End of recommended dictionary. 
###### 00:08:10.744 Done 39 runs in 2 second(s) 00:08:11.003 20:03:55 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_7.conf /var/tmp/suppress_nvmf_fuzz 00:08:11.003 20:03:55 -- ../common.sh@72 -- # (( i++ )) 00:08:11.003 20:03:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:11.003 20:03:55 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:08:11.003 20:03:55 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:08:11.003 20:03:55 -- nvmf/run.sh@24 -- # local timen=1 00:08:11.003 20:03:55 -- nvmf/run.sh@25 -- # local core=0x1 00:08:11.003 20:03:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.003 20:03:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:08:11.003 20:03:55 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:11.003 20:03:55 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:11.003 20:03:55 -- nvmf/run.sh@34 -- # printf %02d 8 00:08:11.003 20:03:55 -- nvmf/run.sh@34 -- # port=4408 00:08:11.003 20:03:55 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.003 20:03:55 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:08:11.003 20:03:55 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:11.003 20:03:55 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:11.003 20:03:55 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:11.003 20:03:55 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 00:08:11.003 [2024-04-26 20:03:55.300448] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:11.003 [2024-04-26 20:03:55.300519] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623908 ] 00:08:11.003 EAL: No free 2048 kB hugepages reported on node 1 00:08:11.262 [2024-04-26 20:03:55.622725] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:11.521 [2024-04-26 20:03:55.716512] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:11.521 [2024-04-26 20:03:55.775794] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:11.521 [2024-04-26 20:03:55.791997] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:08:11.521 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:11.521 INFO: Seed: 762795629 00:08:11.521 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:11.521 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:11.521 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:08:11.521 INFO: A corpus is not provided, starting from an empty corpus 00:08:11.521 [2024-04-26 20:03:55.836740] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.521 [2024-04-26 20:03:55.836772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.521 #2 INITED cov: 11640 ft: 11639 corp: 1/1b exec/s: 0 rss: 68Mb 00:08:11.521 [2024-04-26 20:03:55.886908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.521 [2024-04-26 20:03:55.886937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.521 [2024-04-26 20:03:55.886986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.521 [2024-04-26 20:03:55.887002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.521 [2024-04-26 20:03:55.887032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.521 [2024-04-26 20:03:55.887050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.521 [2024-04-26 20:03:55.887079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.521 [2024-04-26 20:03:55.887094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:11.521 #3 NEW cov: 11772 ft: 12927 corp: 2/5b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:11.521 [2024-04-26 20:03:55.956944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.521 [2024-04-26 20:03:55.956976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.779 #4 NEW cov: 11778 ft: 13173 corp: 3/6b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 ShuffleBytes- 00:08:11.779 [2024-04-26 20:03:56.017180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.017210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.779 [2024-04-26 20:03:56.017259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:11.779 [2024-04-26 20:03:56.017275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.779 [2024-04-26 20:03:56.017305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.017321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.779 #5 NEW cov: 11863 ft: 13613 corp: 4/9b lim: 5 exec/s: 0 rss: 68Mb L: 3/4 MS: 1 EraseBytes- 00:08:11.779 [2024-04-26 20:03:56.087367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.087398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.779 [2024-04-26 20:03:56.087448] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.087464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.779 [2024-04-26 20:03:56.087493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.087510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.779 #6 NEW cov: 11863 ft: 13743 corp: 5/12b lim: 5 exec/s: 0 rss: 69Mb L: 3/4 MS: 1 CopyPart- 00:08:11.779 [2024-04-26 20:03:56.157595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.157625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:11.779 [2024-04-26 20:03:56.157659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.157675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:11.779 [2024-04-26 20:03:56.157705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.157721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:11.779 #7 NEW cov: 11863 ft: 13794 corp: 6/15b lim: 5 exec/s: 0 rss: 69Mb L: 3/4 MS: 1 ShuffleBytes- 00:08:11.779 [2024-04-26 20:03:56.207589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:11.779 [2024-04-26 20:03:56.207620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.037 #8 NEW cov: 11863 ft: 13855 corp: 7/16b 
lim: 5 exec/s: 0 rss: 69Mb L: 1/4 MS: 1 CopyPart- 00:08:12.037 [2024-04-26 20:03:56.257904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.257935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.257969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.257985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.258014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.258030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.258059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.258074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.038 #9 NEW cov: 11863 ft: 13900 corp: 8/20b lim: 5 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 ChangeByte- 00:08:12.038 [2024-04-26 20:03:56.308089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.308117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.308166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.308182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.308212] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.308227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.308257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.308272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.038 #10 NEW cov: 11863 ft: 13981 corp: 9/24b lim: 5 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 CrossOver- 00:08:12.038 [2024-04-26 20:03:56.378184] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.378212] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.378261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.378277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.378307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.378322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.378351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.378366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.038 #11 NEW cov: 11863 ft: 14031 corp: 10/28b lim: 5 exec/s: 0 rss: 69Mb L: 4/4 MS: 1 ChangeByte- 00:08:12.038 [2024-04-26 20:03:56.428402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.428431] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.428480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.428495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.428525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.428540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.428574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.428589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.038 [2024-04-26 20:03:56.428618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.038 [2024-04-26 20:03:56.428634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.038 #12 NEW cov: 11863 ft: 14117 corp: 11/33b lim: 5 exec/s: 0 rss: 69Mb L: 5/5 MS: 1 InsertByte- 00:08:12.296 [2024-04-26 20:03:56.498307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:12.296 [2024-04-26 20:03:56.498336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.296 #13 NEW cov: 11863 ft: 14157 corp: 12/34b lim: 5 exec/s: 0 rss: 69Mb L: 1/5 MS: 1 CopyPart- 00:08:12.296 [2024-04-26 20:03:56.558583] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.296 [2024-04-26 20:03:56.558614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.296 [2024-04-26 20:03:56.558662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.296 [2024-04-26 20:03:56.558678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.296 #14 NEW cov: 11863 ft: 14343 corp: 13/36b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 InsertByte- 00:08:12.296 [2024-04-26 20:03:56.618799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.296 [2024-04-26 20:03:56.618828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.296 [2024-04-26 20:03:56.618885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.296 [2024-04-26 20:03:56.618902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.296 [2024-04-26 20:03:56.618933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.296 [2024-04-26 20:03:56.618949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.296 [2024-04-26 20:03:56.618987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.297 [2024-04-26 20:03:56.619002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.297 #15 NEW cov: 11863 ft: 14353 corp: 14/40b lim: 5 exec/s: 0 rss: 69Mb L: 4/5 MS: 1 ChangeBinInt- 00:08:12.297 [2024-04-26 20:03:56.668973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.297 [2024-04-26 20:03:56.669002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.297 [2024-04-26 20:03:56.669051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.297 [2024-04-26 20:03:56.669070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.297 [2024-04-26 
20:03:56.669100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.297 [2024-04-26 20:03:56.669116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.297 [2024-04-26 20:03:56.669145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.297 [2024-04-26 20:03:56.669161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.863 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:12.863 #16 NEW cov: 11886 ft: 14392 corp: 15/44b lim: 5 exec/s: 16 rss: 70Mb L: 4/5 MS: 1 InsertByte- 00:08:12.863 [2024-04-26 20:03:57.033452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.033492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.033589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.033605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.033693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.033708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.033799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.033815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.033923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.033940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.863 #17 NEW cov: 11886 ft: 14465 corp: 16/49b lim: 5 exec/s: 17 rss: 70Mb L: 5/5 MS: 1 InsertByte- 00:08:12.863 [2024-04-26 20:03:57.083338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.083370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.083472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:12.863 [2024-04-26 20:03:57.083488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.083575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.083592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.083686] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.083701] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.863 #18 NEW cov: 11886 ft: 14515 corp: 17/53b lim: 5 exec/s: 18 rss: 70Mb L: 4/5 MS: 1 ChangeByte- 00:08:12.863 [2024-04-26 20:03:57.132860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.132892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.132994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.133010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.863 #19 NEW cov: 11886 ft: 14539 corp: 18/55b lim: 5 exec/s: 19 rss: 70Mb L: 2/5 MS: 1 CrossOver- 00:08:12.863 [2024-04-26 20:03:57.193229] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.193256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.193350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.193366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.863 #20 NEW cov: 11886 ft: 14566 corp: 19/57b lim: 5 exec/s: 20 rss: 70Mb L: 2/5 MS: 1 EraseBytes- 00:08:12.863 [2024-04-26 20:03:57.254591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.254617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.254709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.254727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 
20:03:57.254821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.254838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.254925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.254943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:12.863 [2024-04-26 20:03:57.255046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:12.863 [2024-04-26 20:03:57.255064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:12.863 #21 NEW cov: 11886 ft: 14583 corp: 20/62b lim: 5 exec/s: 21 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:08:13.122 [2024-04-26 20:03:57.314433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.314463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.314562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.314579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.314669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.314685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.314783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.314799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.122 #22 NEW cov: 11886 ft: 14663 corp: 21/66b lim: 5 exec/s: 22 rss: 70Mb L: 4/5 MS: 1 ChangeASCIIInt- 00:08:13.122 [2024-04-26 20:03:57.375258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.375285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.375387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.375403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.375497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.375513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.375602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.375617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.375715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.375730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.122 #23 NEW cov: 11886 ft: 14685 corp: 22/71b lim: 5 exec/s: 23 rss: 71Mb L: 5/5 MS: 1 ChangeBinInt- 00:08:13.122 [2024-04-26 20:03:57.435465] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.435490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.435582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.435599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.435694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.435713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.435805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.435820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.122 #24 NEW cov: 11886 ft: 14760 corp: 23/75b lim: 5 exec/s: 24 rss: 71Mb L: 4/5 MS: 1 ChangeBit- 00:08:13.122 [2024-04-26 20:03:57.484628] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.484656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.122 #25 NEW cov: 11886 ft: 14776 corp: 24/76b lim: 5 exec/s: 25 rss: 71Mb L: 1/5 MS: 1 EraseBytes- 00:08:13.122 [2024-04-26 20:03:57.535569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 
cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.535595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.535694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.535709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.122 [2024-04-26 20:03:57.535822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.122 [2024-04-26 20:03:57.535839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.122 #26 NEW cov: 11886 ft: 14789 corp: 25/79b lim: 5 exec/s: 26 rss: 71Mb L: 3/5 MS: 1 EraseBytes- 00:08:13.381 [2024-04-26 20:03:57.586640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.586668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.586764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.586782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.586877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.586893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.586987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.587004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.587098] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.587115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.381 #27 NEW cov: 11886 ft: 14807 corp: 26/84b lim: 5 exec/s: 27 rss: 71Mb L: 5/5 MS: 1 CrossOver- 00:08:13.381 [2024-04-26 20:03:57.646625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.646653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.646756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.646773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.646867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.646887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.646970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.646986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.381 #28 NEW cov: 11886 ft: 14827 corp: 27/88b lim: 5 exec/s: 28 rss: 71Mb L: 4/5 MS: 1 ChangeByte- 00:08:13.381 [2024-04-26 20:03:57.697236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.697261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.697344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.697360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.697450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.697466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.697553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.697570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.697668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.697684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.381 #29 NEW cov: 11887 ft: 14838 corp: 28/93b lim: 5 exec/s: 29 rss: 71Mb L: 5/5 MS: 1 CopyPart- 00:08:13.381 [2024-04-26 20:03:57.756369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.756394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 
20:03:57.756480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.756496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.381 #30 NEW cov: 11887 ft: 14842 corp: 29/95b lim: 5 exec/s: 30 rss: 71Mb L: 2/5 MS: 1 ChangeBinInt- 00:08:13.381 [2024-04-26 20:03:57.807629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.807654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.807744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.807761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.807857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.807876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.807982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.807999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:13.381 [2024-04-26 20:03:57.808092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:13.381 [2024-04-26 20:03:57.808107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:13.641 #31 NEW cov: 11887 ft: 14846 corp: 30/100b lim: 5 exec/s: 15 rss: 71Mb L: 5/5 MS: 1 ChangeBit- 00:08:13.641 #31 DONE cov: 11887 ft: 14846 corp: 30/100b lim: 5 exec/s: 15 rss: 71Mb 00:08:13.641 Done 31 runs in 2 second(s) 00:08:13.641 20:03:57 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_8.conf /var/tmp/suppress_nvmf_fuzz 00:08:13.641 20:03:57 -- ../common.sh@72 -- # (( i++ )) 00:08:13.641 20:03:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:13.641 20:03:57 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:08:13.641 20:03:57 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:08:13.641 20:03:57 -- nvmf/run.sh@24 -- # local timen=1 00:08:13.641 20:03:57 -- nvmf/run.sh@25 -- # local core=0x1 00:08:13.641 20:03:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:13.641 20:03:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:08:13.641 20:03:57 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:13.641 20:03:57 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:13.641 20:03:57 -- 
nvmf/run.sh@34 -- # printf %02d 9 00:08:13.641 20:03:57 -- nvmf/run.sh@34 -- # port=4409 00:08:13.641 20:03:57 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:13.641 20:03:57 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:08:13.641 20:03:57 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:13.641 20:03:57 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:13.641 20:03:57 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:13.641 20:03:57 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 00:08:13.641 [2024-04-26 20:03:58.011157] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:13.641 [2024-04-26 20:03:58.011243] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624262 ] 00:08:13.641 EAL: No free 2048 kB hugepages reported on node 1 00:08:13.899 [2024-04-26 20:03:58.330865] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.157 [2024-04-26 20:03:58.416638] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.158 [2024-04-26 20:03:58.475971] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:14.158 [2024-04-26 20:03:58.492177] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:08:14.158 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:14.158 INFO: Seed: 3463785134 00:08:14.158 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:14.158 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:14.158 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:08:14.158 INFO: A corpus is not provided, starting from an empty corpus 00:08:14.158 [2024-04-26 20:03:58.537013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.158 [2024-04-26 20:03:58.537048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.158 #2 INITED cov: 11642 ft: 11643 corp: 1/1b exec/s: 0 rss: 68Mb 00:08:14.158 [2024-04-26 20:03:58.586991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.158 [2024-04-26 20:03:58.587034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.158 [2024-04-26 20:03:58.587084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.158 [2024-04-26 20:03:58.587100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.416 #3 NEW cov: 11772 ft: 12916 corp: 2/3b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:08:14.416 [2024-04-26 20:03:58.657170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.416 [2024-04-26 20:03:58.657202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.416 [2024-04-26 20:03:58.657251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.416 [2024-04-26 20:03:58.657267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.416 #4 NEW cov: 11778 ft: 13106 corp: 3/5b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeByte- 00:08:14.416 [2024-04-26 20:03:58.727416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.416 [2024-04-26 20:03:58.727445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.416 [2024-04-26 20:03:58.727493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.416 [2024-04-26 20:03:58.727509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.416 [2024-04-26 20:03:58.727539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:14.416 [2024-04-26 20:03:58.727555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.416 #5 NEW cov: 11863 ft: 13549 corp: 4/8b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 InsertByte- 00:08:14.416 [2024-04-26 20:03:58.787520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.416 [2024-04-26 20:03:58.787551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.416 [2024-04-26 20:03:58.787584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.416 [2024-04-26 20:03:58.787599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.416 #6 NEW cov: 11863 ft: 13711 corp: 5/10b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeBit- 00:08:14.417 [2024-04-26 20:03:58.837771] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.417 [2024-04-26 20:03:58.837800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.417 [2024-04-26 20:03:58.837833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.417 [2024-04-26 20:03:58.837849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.417 [2024-04-26 20:03:58.837888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.417 [2024-04-26 20:03:58.837904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.417 [2024-04-26 20:03:58.837933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.417 [2024-04-26 20:03:58.837964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:14.675 #7 NEW cov: 11863 ft: 14087 corp: 6/14b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:14.675 [2024-04-26 20:03:58.897761] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.675 [2024-04-26 20:03:58.897791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.675 [2024-04-26 20:03:58.897824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.675 [2024-04-26 20:03:58.897840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.675 #8 NEW cov: 11863 ft: 14191 
corp: 7/16b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 EraseBytes- 00:08:14.675 [2024-04-26 20:03:58.967952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.675 [2024-04-26 20:03:58.967981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.675 [2024-04-26 20:03:58.968029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.675 [2024-04-26 20:03:58.968045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.675 #9 NEW cov: 11863 ft: 14228 corp: 8/18b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 ShuffleBytes- 00:08:14.675 [2024-04-26 20:03:59.038208] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.675 [2024-04-26 20:03:59.038242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.675 [2024-04-26 20:03:59.038292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.675 [2024-04-26 20:03:59.038308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.675 [2024-04-26 20:03:59.038338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.675 [2024-04-26 20:03:59.038354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.675 #10 NEW cov: 11863 ft: 14262 corp: 9/21b lim: 5 exec/s: 0 rss: 69Mb L: 3/4 MS: 1 ChangeBit- 00:08:14.675 [2024-04-26 20:03:59.088237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.675 [2024-04-26 20:03:59.088267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.934 #11 NEW cov: 11863 ft: 14302 corp: 10/22b lim: 5 exec/s: 0 rss: 69Mb L: 1/4 MS: 1 EraseBytes- 00:08:14.934 [2024-04-26 20:03:59.148430] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.934 [2024-04-26 20:03:59.148460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.934 [2024-04-26 20:03:59.148508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.934 [2024-04-26 20:03:59.148524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.934 #12 NEW cov: 11863 ft: 14328 corp: 11/24b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 CopyPart- 00:08:14.934 [2024-04-26 20:03:59.198475] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.934 [2024-04-26 20:03:59.198503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.934 #13 NEW cov: 11863 ft: 14363 corp: 12/25b lim: 5 exec/s: 0 rss: 69Mb L: 1/4 MS: 1 ChangeByte- 00:08:14.934 [2024-04-26 20:03:59.248717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.934 [2024-04-26 20:03:59.248745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.934 [2024-04-26 20:03:59.248793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.934 [2024-04-26 20:03:59.248809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.934 [2024-04-26 20:03:59.248838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.934 [2024-04-26 20:03:59.248853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:14.934 #14 NEW cov: 11863 ft: 14416 corp: 13/28b lim: 5 exec/s: 0 rss: 69Mb L: 3/4 MS: 1 ShuffleBytes- 00:08:14.934 [2024-04-26 20:03:59.318886] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.934 [2024-04-26 20:03:59.318915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:14.934 [2024-04-26 20:03:59.318952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:14.934 [2024-04-26 20:03:59.318968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:14.934 #15 NEW cov: 11863 ft: 14438 corp: 14/30b lim: 5 exec/s: 0 rss: 69Mb L: 2/4 MS: 1 ChangeBinInt- 00:08:15.193 [2024-04-26 20:03:59.389209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.193 [2024-04-26 20:03:59.389239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.193 [2024-04-26 20:03:59.389272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.193 [2024-04-26 20:03:59.389287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.193 [2024-04-26 20:03:59.389317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.193 [2024-04-26 
20:03:59.389332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.193 [2024-04-26 20:03:59.389376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.193 [2024-04-26 20:03:59.389392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.451 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:15.451 #16 NEW cov: 11886 ft: 14514 corp: 15/34b lim: 5 exec/s: 16 rss: 70Mb L: 4/4 MS: 1 InsertByte- 00:08:15.451 [2024-04-26 20:03:59.752632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.451 [2024-04-26 20:03:59.752679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.451 [2024-04-26 20:03:59.752785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.451 [2024-04-26 20:03:59.752805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.451 #17 NEW cov: 11886 ft: 14612 corp: 16/36b lim: 5 exec/s: 17 rss: 70Mb L: 2/4 MS: 1 ChangeBit- 00:08:15.451 [2024-04-26 20:03:59.803577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.451 [2024-04-26 20:03:59.803607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.451 [2024-04-26 20:03:59.803704] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.451 [2024-04-26 20:03:59.803723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.451 [2024-04-26 20:03:59.803813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.451 [2024-04-26 20:03:59.803832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.451 [2024-04-26 20:03:59.803933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.451 [2024-04-26 20:03:59.803956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.451 #18 NEW cov: 11886 ft: 14653 corp: 17/40b lim: 5 exec/s: 18 rss: 70Mb L: 4/4 MS: 1 CrossOver- 00:08:15.451 [2024-04-26 20:03:59.853497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.451 [2024-04-26 20:03:59.853522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.451 [2024-04-26 20:03:59.853612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.451 [2024-04-26 20:03:59.853629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.451 [2024-04-26 20:03:59.853717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.452 [2024-04-26 20:03:59.853732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.452 #19 NEW cov: 11886 ft: 14661 corp: 18/43b lim: 5 exec/s: 19 rss: 70Mb L: 3/4 MS: 1 ChangeBinInt- 00:08:15.710 [2024-04-26 20:03:59.904132] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:03:59.904159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:03:59.904254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:03:59.904270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:03:59.904363] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:03:59.904378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:03:59.904469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:03:59.904483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.710 #20 NEW cov: 11886 ft: 14695 corp: 19/47b lim: 5 exec/s: 20 rss: 70Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:08:15.710 [2024-04-26 20:03:59.964016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:03:59.964041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:03:59.964149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:03:59.964165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:03:59.964261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 
20:03:59.964278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.710 #21 NEW cov: 11886 ft: 14705 corp: 20/50b lim: 5 exec/s: 21 rss: 70Mb L: 3/4 MS: 1 InsertByte- 00:08:15.710 [2024-04-26 20:04:00.024830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.024859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:04:00.024957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.024975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:04:00.025069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.025086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:04:00.025179] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.025195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.710 #22 NEW cov: 11886 ft: 14717 corp: 21/54b lim: 5 exec/s: 22 rss: 70Mb L: 4/4 MS: 1 InsertByte- 00:08:15.710 [2024-04-26 20:04:00.075254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.075285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:04:00.075367] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.075384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:04:00.075468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.075484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:04:00.075576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.075592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:15.710 #23 NEW cov: 11886 ft: 14729 corp: 22/58b lim: 5 exec/s: 23 rss: 70Mb L: 4/4 MS: 1 CrossOver- 00:08:15.710 [2024-04-26 20:04:00.125144] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.125172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:04:00.125258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.125276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.710 [2024-04-26 20:04:00.125361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.710 [2024-04-26 20:04:00.125376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.968 #24 NEW cov: 11886 ft: 14779 corp: 23/61b lim: 5 exec/s: 24 rss: 70Mb L: 3/4 MS: 1 ShuffleBytes- 00:08:15.968 [2024-04-26 20:04:00.185472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.185497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.968 [2024-04-26 20:04:00.185582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.185597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.968 [2024-04-26 20:04:00.185689] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.185703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:15.968 #25 NEW cov: 11886 ft: 14785 corp: 24/64b lim: 5 exec/s: 25 rss: 70Mb L: 3/4 MS: 1 CopyPart- 00:08:15.968 [2024-04-26 20:04:00.235826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.235851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.968 [2024-04-26 20:04:00.235941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.235958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.968 [2024-04-26 20:04:00.236046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.236062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 
m:0 dnr:0 00:08:15.968 #26 NEW cov: 11886 ft: 14800 corp: 25/67b lim: 5 exec/s: 26 rss: 71Mb L: 3/4 MS: 1 InsertByte- 00:08:15.968 [2024-04-26 20:04:00.295751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.295775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.968 [2024-04-26 20:04:00.295877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.295893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.968 #27 NEW cov: 11886 ft: 14808 corp: 26/69b lim: 5 exec/s: 27 rss: 71Mb L: 2/4 MS: 1 CopyPart- 00:08:15.968 [2024-04-26 20:04:00.345953] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.968 [2024-04-26 20:04:00.345978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.968 [2024-04-26 20:04:00.346057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.969 [2024-04-26 20:04:00.346074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.969 #28 NEW cov: 11886 ft: 14848 corp: 27/71b lim: 5 exec/s: 28 rss: 71Mb L: 2/4 MS: 1 CrossOver- 00:08:15.969 [2024-04-26 20:04:00.396823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.969 [2024-04-26 20:04:00.396847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:15.969 [2024-04-26 20:04:00.396937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.969 [2024-04-26 20:04:00.396953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:15.969 [2024-04-26 20:04:00.397039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:15.969 [2024-04-26 20:04:00.397053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.227 #29 NEW cov: 11886 ft: 14863 corp: 28/74b lim: 5 exec/s: 29 rss: 71Mb L: 3/4 MS: 1 ChangeByte- 00:08:16.227 [2024-04-26 20:04:00.447133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.227 [2024-04-26 20:04:00.447158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.227 [2024-04-26 20:04:00.447251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.227 [2024-04-26 20:04:00.447268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.227 [2024-04-26 20:04:00.447356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.227 [2024-04-26 20:04:00.447371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:16.227 #30 NEW cov: 11887 ft: 14887 corp: 29/77b lim: 5 exec/s: 30 rss: 71Mb L: 3/4 MS: 1 EraseBytes- 00:08:16.227 [2024-04-26 20:04:00.507027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.227 [2024-04-26 20:04:00.507052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:16.227 [2024-04-26 20:04:00.507150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:16.227 [2024-04-26 20:04:00.507166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:16.227 #31 NEW cov: 11887 ft: 14896 corp: 30/79b lim: 5 exec/s: 15 rss: 71Mb L: 2/4 MS: 1 CrossOver- 00:08:16.227 #31 DONE cov: 11887 ft: 14896 corp: 30/79b lim: 5 exec/s: 15 rss: 71Mb 00:08:16.227 Done 31 runs in 2 second(s) 00:08:16.227 20:04:00 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_9.conf /var/tmp/suppress_nvmf_fuzz 00:08:16.227 20:04:00 -- ../common.sh@72 -- # (( i++ )) 00:08:16.227 20:04:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:16.227 20:04:00 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:08:16.227 20:04:00 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:08:16.227 20:04:00 -- nvmf/run.sh@24 -- # local timen=1 00:08:16.227 20:04:00 -- nvmf/run.sh@25 -- # local core=0x1 00:08:16.227 20:04:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:16.227 20:04:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:08:16.227 20:04:00 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:16.486 20:04:00 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:16.486 20:04:00 -- nvmf/run.sh@34 -- # printf %02d 10 00:08:16.486 20:04:00 -- nvmf/run.sh@34 -- # port=4410 00:08:16.486 20:04:00 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:16.486 20:04:00 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:08:16.486 20:04:00 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:16.486 20:04:00 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:16.486 20:04:00 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:16.486 20:04:00 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz 
-m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 00:08:16.486 [2024-04-26 20:04:00.715289] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:16.486 [2024-04-26 20:04:00.715358] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624621 ] 00:08:16.486 EAL: No free 2048 kB hugepages reported on node 1 00:08:16.744 [2024-04-26 20:04:01.033356] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:16.744 [2024-04-26 20:04:01.122589] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:16.745 [2024-04-26 20:04:01.182036] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:17.003 [2024-04-26 20:04:01.198250] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:08:17.003 INFO: Running with entropic power schedule (0xFF, 100). 00:08:17.003 INFO: Seed: 1872833123 00:08:17.003 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:17.003 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:17.003 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:08:17.003 INFO: A corpus is not provided, starting from an empty corpus 00:08:17.003 #2 INITED exec/s: 0 rss: 63Mb 00:08:17.003 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:17.003 This may also happen if the target rejected all inputs we tried so far 00:08:17.003 [2024-04-26 20:04:01.246885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.003 [2024-04-26 20:04:01.246921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.003 [2024-04-26 20:04:01.246957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.003 [2024-04-26 20:04:01.246974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.003 [2024-04-26 20:04:01.247006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.003 [2024-04-26 20:04:01.247022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.260 NEW_FUNC[1/670]: 0x48eb70 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:08:17.260 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:17.260 #16 NEW cov: 11665 ft: 11666 corp: 2/27b lim: 40 exec/s: 0 rss: 69Mb L: 26/26 MS: 4 CopyPart-InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:17.260 [2024-04-26 20:04:01.630218] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.260 [2024-04-26 20:04:01.630272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.260 [2024-04-26 20:04:01.630373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.260 [2024-04-26 20:04:01.630395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.260 [2024-04-26 20:04:01.630491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.260 [2024-04-26 20:04:01.630512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.260 #32 NEW cov: 11795 ft: 12321 corp: 3/53b lim: 40 exec/s: 0 rss: 69Mb L: 26/26 MS: 1 CrossOver- 00:08:17.260 [2024-04-26 20:04:01.690347] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000a0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.260 [2024-04-26 20:04:01.690374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.260 [2024-04-26 20:04:01.690462] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.261 [2024-04-26 
20:04:01.690478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.261 [2024-04-26 20:04:01.690570] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.261 [2024-04-26 20:04:01.690585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.519 #33 NEW cov: 11801 ft: 12453 corp: 4/80b lim: 40 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 CrossOver- 00:08:17.519 [2024-04-26 20:04:01.740509] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.740533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.740620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.740635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.740720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.740736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.519 #41 NEW cov: 11886 ft: 12664 corp: 5/109b lim: 40 exec/s: 0 rss: 69Mb L: 29/29 MS: 3 CopyPart-ShuffleBytes-InsertRepeatedBytes- 00:08:17.519 [2024-04-26 20:04:01.790697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.790721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.790821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.790836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.790928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.790945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.519 #42 NEW cov: 11886 ft: 12762 corp: 6/135b lim: 40 exec/s: 0 rss: 69Mb L: 26/29 MS: 1 ChangeBit- 00:08:17.519 [2024-04-26 20:04:01.850830] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.850855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.850960] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.850977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.851067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.851082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.519 #43 NEW cov: 11886 ft: 12921 corp: 7/161b lim: 40 exec/s: 0 rss: 69Mb L: 26/29 MS: 1 CrossOver- 00:08:17.519 [2024-04-26 20:04:01.901401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:a1a1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.901427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.901519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:a1a1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.901535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.901630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:a1a1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.901645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.901737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:a1a1a1a1 cdw11:a1a1a1a1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.901752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:17.519 #45 NEW cov: 11886 ft: 13435 corp: 8/196b lim: 40 exec/s: 0 rss: 69Mb L: 35/35 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:17.519 [2024-04-26 20:04:01.951287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.951310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.951396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.951413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.519 [2024-04-26 20:04:01.951503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:1a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.519 [2024-04-26 20:04:01.951517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 
cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.778 #46 NEW cov: 11886 ft: 13459 corp: 9/222b lim: 40 exec/s: 0 rss: 69Mb L: 26/35 MS: 1 ChangeBinInt- 00:08:17.778 [2024-04-26 20:04:02.001291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.001315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.778 [2024-04-26 20:04:02.001416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.001432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.778 [2024-04-26 20:04:02.001519] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.001534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.778 #47 NEW cov: 11886 ft: 13503 corp: 10/248b lim: 40 exec/s: 0 rss: 69Mb L: 26/35 MS: 1 ShuffleBytes- 00:08:17.778 [2024-04-26 20:04:02.051724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.051749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.778 [2024-04-26 20:04:02.051844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a0a0a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.051860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.778 [2024-04-26 20:04:02.051946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:1a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.051962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.778 #48 NEW cov: 11886 ft: 13550 corp: 11/274b lim: 40 exec/s: 0 rss: 69Mb L: 26/35 MS: 1 CrossOver- 00:08:17.778 [2024-04-26 20:04:02.111876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.111900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.778 [2024-04-26 20:04:02.111997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.112014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.778 [2024-04-26 20:04:02.112100] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d07008d SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.112116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.778 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:17.778 #49 NEW cov: 11903 ft: 13590 corp: 12/303b lim: 40 exec/s: 0 rss: 69Mb L: 29/35 MS: 1 CMP- DE: "\007\000"- 00:08:17.778 [2024-04-26 20:04:02.172025] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:1a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.778 [2024-04-26 20:04:02.172052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:17.778 [2024-04-26 20:04:02.172135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.779 [2024-04-26 20:04:02.172154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:17.779 [2024-04-26 20:04:02.172240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:17.779 [2024-04-26 20:04:02.172258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:17.779 #50 NEW cov: 11903 ft: 13723 corp: 13/329b lim: 40 exec/s: 0 rss: 69Mb L: 26/35 MS: 1 ChangeBinInt- 00:08:18.037 [2024-04-26 20:04:02.222011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.222036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.222127] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8d8d8d07 cdw11:008d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.222142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.037 #51 NEW cov: 11903 ft: 13965 corp: 14/348b lim: 40 exec/s: 51 rss: 70Mb L: 19/35 MS: 1 EraseBytes- 00:08:18.037 [2024-04-26 20:04:02.282674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.282700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.282797] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.282813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.282904] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 
20:04:02.282918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.283012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.283026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.037 #54 NEW cov: 11903 ft: 13980 corp: 15/380b lim: 40 exec/s: 54 rss: 70Mb L: 32/35 MS: 3 ChangeByte-ChangeByte-InsertRepeatedBytes- 00:08:18.037 [2024-04-26 20:04:02.332525] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.332549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.332635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.332652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.332737] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.332752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.037 #55 NEW cov: 11903 ft: 14055 corp: 16/406b lim: 40 exec/s: 55 rss: 70Mb L: 26/35 MS: 1 EraseBytes- 00:08:18.037 [2024-04-26 20:04:02.382756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.382779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.382870] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:0a0a0a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.382890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.382997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:1a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.383012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.037 #56 NEW cov: 11903 ft: 14077 corp: 17/432b lim: 40 exec/s: 56 rss: 70Mb L: 26/35 MS: 1 ShuffleBytes- 00:08:18.037 [2024-04-26 20:04:02.442895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.442920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 
20:04:02.443007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.443033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.037 [2024-04-26 20:04:02.443124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.037 [2024-04-26 20:04:02.443140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.037 #57 NEW cov: 11903 ft: 14153 corp: 18/458b lim: 40 exec/s: 57 rss: 70Mb L: 26/35 MS: 1 CopyPart- 00:08:18.296 [2024-04-26 20:04:02.493192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 [2024-04-26 20:04:02.493217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.296 [2024-04-26 20:04:02.493312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 [2024-04-26 20:04:02.493328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.296 [2024-04-26 20:04:02.493412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 [2024-04-26 20:04:02.493428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.296 #58 NEW cov: 11903 ft: 14229 corp: 19/489b lim: 40 exec/s: 58 rss: 70Mb L: 31/35 MS: 1 InsertRepeatedBytes- 00:08:18.296 [2024-04-26 20:04:02.543204] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 [2024-04-26 20:04:02.543229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.296 [2024-04-26 20:04:02.543312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 [2024-04-26 20:04:02.543332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.296 [2024-04-26 20:04:02.543413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:e2000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 [2024-04-26 20:04:02.543426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.296 #59 NEW cov: 11903 ft: 14230 corp: 20/515b lim: 40 exec/s: 59 rss: 70Mb L: 26/35 MS: 1 ChangeByte- 00:08:18.296 [2024-04-26 20:04:02.593658] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 
[2024-04-26 20:04:02.593681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.296 [2024-04-26 20:04:02.593763] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8d8d8d01 cdw11:0adebabb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 [2024-04-26 20:04:02.593780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.296 [2024-04-26 20:04:02.593875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:a19e968d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.296 [2024-04-26 20:04:02.593891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.297 [2024-04-26 20:04:02.593991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:8d8d8d8d cdw11:8d07008d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.297 [2024-04-26 20:04:02.594006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.297 #60 NEW cov: 11903 ft: 14249 corp: 21/552b lim: 40 exec/s: 60 rss: 70Mb L: 37/37 MS: 1 CMP- DE: "\001\012\336\272\273\241\236\226"- 00:08:18.297 [2024-04-26 20:04:02.643595] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8dbb SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.297 [2024-04-26 20:04:02.643620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.297 [2024-04-26 20:04:02.643710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.297 [2024-04-26 20:04:02.643727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.297 [2024-04-26 20:04:02.643823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d0700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.297 [2024-04-26 20:04:02.643837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.297 #61 NEW cov: 11903 ft: 14291 corp: 22/582b lim: 40 exec/s: 61 rss: 70Mb L: 30/37 MS: 1 InsertByte- 00:08:18.297 [2024-04-26 20:04:02.693989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.297 [2024-04-26 20:04:02.694014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.297 [2024-04-26 20:04:02.694104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10000000 cdw11:0000ff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.297 [2024-04-26 20:04:02.694119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.297 [2024-04-26 20:04:02.694211] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 
nsid:0 cdw10:debac965 cdw11:a6de0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.297 [2024-04-26 20:04:02.694230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.297 [2024-04-26 20:04:02.694326] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.297 [2024-04-26 20:04:02.694342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.297 #62 NEW cov: 11903 ft: 14304 corp: 23/616b lim: 40 exec/s: 62 rss: 70Mb L: 34/37 MS: 1 CMP- DE: "\377\011\336\272\311e\246\336"- 00:08:18.556 [2024-04-26 20:04:02.753901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.556 [2024-04-26 20:04:02.753927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.556 [2024-04-26 20:04:02.754023] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.556 [2024-04-26 20:04:02.754040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.556 [2024-04-26 20:04:02.754140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a008d8d cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.556 [2024-04-26 20:04:02.754155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.556 #63 NEW cov: 11903 ft: 14310 corp: 24/642b lim: 40 exec/s: 63 rss: 70Mb L: 26/37 MS: 1 CrossOver- 00:08:18.556 [2024-04-26 20:04:02.804391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.556 [2024-04-26 20:04:02.804416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.556 [2024-04-26 20:04:02.804516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.556 [2024-04-26 20:04:02.804533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.556 [2024-04-26 20:04:02.804622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.556 [2024-04-26 20:04:02.804638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.804729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.804745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.557 #64 NEW cov: 11903 ft: 14329 corp: 
25/676b lim: 40 exec/s: 64 rss: 70Mb L: 34/37 MS: 1 CrossOver- 00:08:18.557 [2024-04-26 20:04:02.854355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:00000300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.854380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.854463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.854480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.854564] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:0a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.854579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.557 #65 NEW cov: 11903 ft: 14359 corp: 26/702b lim: 40 exec/s: 65 rss: 70Mb L: 26/37 MS: 1 CMP- DE: "\003\000\000\000"- 00:08:18.557 [2024-04-26 20:04:02.904449] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:0000000a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.904474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.904566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.904584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.904673] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.904688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.557 #66 NEW cov: 11903 ft: 14389 corp: 27/729b lim: 40 exec/s: 66 rss: 70Mb L: 27/37 MS: 1 ChangeBinInt- 00:08:18.557 [2024-04-26 20:04:02.965308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:0000f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.965332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.965417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:f8f8f8f8 cdw11:f8f8f8f8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.965432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.965522] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:f8f8f8f8 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 
20:04:02.965537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.965624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.965639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.557 [2024-04-26 20:04:02.965717] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00001a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.557 [2024-04-26 20:04:02.965733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.557 #67 NEW cov: 11903 ft: 14465 corp: 28/769b lim: 40 exec/s: 67 rss: 70Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:18.816 [2024-04-26 20:04:03.015396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.015421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.015516] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10000000 cdw11:0000ff09 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.015532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.015623] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:debac965 cdw11:a6de0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.015638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.015724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.015740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.015824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.015838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:18.816 #68 NEW cov: 11903 ft: 14484 corp: 29/809b lim: 40 exec/s: 68 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:08:18.816 [2024-04-26 20:04:03.074784] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.074809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.074890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8d8d8d07 cdw11:008d8d8d SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.074917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.816 #69 NEW cov: 11903 ft: 14507 corp: 30/828b lim: 40 exec/s: 69 rss: 70Mb L: 19/40 MS: 1 ShuffleBytes- 00:08:18.816 [2024-04-26 20:04:03.134856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0a00 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.134884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.134984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:10de0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.135001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.816 #70 NEW cov: 11910 ft: 14526 corp: 31/850b lim: 40 exec/s: 70 rss: 70Mb L: 22/40 MS: 1 EraseBytes- 00:08:18.816 [2024-04-26 20:04:03.185452] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:0a0a0000 cdw11:f8ff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.185476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.185573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.185589] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.185679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:1a000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.185694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:18.816 #71 NEW cov: 11910 ft: 14547 corp: 32/876b lim: 40 exec/s: 71 rss: 70Mb L: 26/40 MS: 1 ChangeBinInt- 00:08:18.816 [2024-04-26 20:04:03.235555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.235582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.235668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.235684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:18.816 [2024-04-26 20:04:03.235772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:8d8d8d8d cdw11:8d8d8d8d SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:18.816 [2024-04-26 20:04:03.235788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:08:19.075 #72 NEW cov: 11910 ft: 14558 corp: 33/902b lim: 40 exec/s: 36 rss: 70Mb L: 26/40 MS: 1 CopyPart- 00:08:19.075 #72 DONE cov: 11910 ft: 14558 corp: 33/902b lim: 40 exec/s: 36 rss: 70Mb 00:08:19.075 ###### Recommended dictionary. ###### 00:08:19.075 "\007\000" # Uses: 0 00:08:19.075 "\001\012\336\272\273\241\236\226" # Uses: 0 00:08:19.075 "\377\011\336\272\311e\246\336" # Uses: 0 00:08:19.075 "\003\000\000\000" # Uses: 0 00:08:19.075 ###### End of recommended dictionary. ###### 00:08:19.075 Done 72 runs in 2 second(s) 00:08:19.075 20:04:03 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_10.conf /var/tmp/suppress_nvmf_fuzz 00:08:19.075 20:04:03 -- ../common.sh@72 -- # (( i++ )) 00:08:19.075 20:04:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:19.075 20:04:03 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:08:19.075 20:04:03 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:08:19.075 20:04:03 -- nvmf/run.sh@24 -- # local timen=1 00:08:19.075 20:04:03 -- nvmf/run.sh@25 -- # local core=0x1 00:08:19.075 20:04:03 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.075 20:04:03 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:08:19.075 20:04:03 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:19.075 20:04:03 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:19.075 20:04:03 -- nvmf/run.sh@34 -- # printf %02d 11 00:08:19.075 20:04:03 -- nvmf/run.sh@34 -- # port=4411 00:08:19.075 20:04:03 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.075 20:04:03 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:08:19.075 20:04:03 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:19.075 20:04:03 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:19.075 20:04:03 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:19.075 20:04:03 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 00:08:19.075 [2024-04-26 20:04:03.432744] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:08:19.075 [2024-04-26 20:04:03.432835] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624987 ] 00:08:19.075 EAL: No free 2048 kB hugepages reported on node 1 00:08:19.334 [2024-04-26 20:04:03.623819] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:19.334 [2024-04-26 20:04:03.694379] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:19.334 [2024-04-26 20:04:03.753811] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:19.334 [2024-04-26 20:04:03.770023] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:08:19.593 INFO: Running with entropic power schedule (0xFF, 100). 00:08:19.593 INFO: Seed: 151858891 00:08:19.593 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:19.593 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:19.593 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:08:19.593 INFO: A corpus is not provided, starting from an empty corpus 00:08:19.593 #2 INITED exec/s: 0 rss: 63Mb 00:08:19.593 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:19.593 This may also happen if the target rejected all inputs we tried so far 00:08:19.593 [2024-04-26 20:04:03.824918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.593 [2024-04-26 20:04:03.824954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.593 [2024-04-26 20:04:03.824989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.593 [2024-04-26 20:04:03.825006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.853 NEW_FUNC[1/671]: 0x4908e0 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:08:19.853 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:19.853 #4 NEW cov: 11661 ft: 11675 corp: 2/17b lim: 40 exec/s: 0 rss: 69Mb L: 16/16 MS: 2 ChangeBinInt-InsertRepeatedBytes- 00:08:19.853 [2024-04-26 20:04:04.198419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.853 [2024-04-26 20:04:04.198467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.853 [2024-04-26 20:04:04.198568] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5bfa5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.853 [2024-04-26 20:04:04.198588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.853 #5 NEW cov: 11807 ft: 12262 corp: 3/33b lim: 40 exec/s: 0 rss: 69Mb L: 16/16 MS: 1 
ChangeByte- 00:08:19.853 [2024-04-26 20:04:04.259065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.853 [2024-04-26 20:04:04.259096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:19.853 [2024-04-26 20:04:04.259199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.853 [2024-04-26 20:04:04.259219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:19.853 [2024-04-26 20:04:04.259314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.853 [2024-04-26 20:04:04.259331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:19.853 [2024-04-26 20:04:04.259435] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:19.853 [2024-04-26 20:04:04.259453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:19.853 #6 NEW cov: 11813 ft: 12794 corp: 4/69b lim: 40 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:20.112 [2024-04-26 20:04:04.309071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.309102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.309199] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.309217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.309304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.309320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.309411] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.309426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.112 #7 NEW cov: 11898 ft: 13075 corp: 5/108b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 CrossOver- 00:08:20.112 [2024-04-26 20:04:04.368183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aeeeeee cdw11:eeeeeeee SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.368209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:20.112 #8 NEW cov: 11898 ft: 13862 corp: 6/120b lim: 40 exec/s: 0 rss: 69Mb L: 12/39 MS: 1 InsertRepeatedBytes- 00:08:20.112 [2024-04-26 20:04:04.418690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.418715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.418803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a1a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.418817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.112 #9 NEW cov: 11898 ft: 13911 corp: 7/136b lim: 40 exec/s: 0 rss: 69Mb L: 16/39 MS: 1 ChangeBit- 00:08:20.112 [2024-04-26 20:04:04.469494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.469518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.469607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.469623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.469703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.469718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.469806] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcb6dc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.469821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.112 #10 NEW cov: 11898 ft: 13958 corp: 8/173b lim: 40 exec/s: 0 rss: 69Mb L: 37/39 MS: 1 InsertByte- 00:08:20.112 [2024-04-26 20:04:04.519760] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.519787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.519871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.112 [2024-04-26 20:04:04.519890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.112 [2024-04-26 20:04:04.519988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.113 
[2024-04-26 20:04:04.520002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.113 [2024-04-26 20:04:04.520085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.113 [2024-04-26 20:04:04.520100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.113 #11 NEW cov: 11898 ft: 13998 corp: 9/209b lim: 40 exec/s: 0 rss: 69Mb L: 36/39 MS: 1 ShuffleBytes- 00:08:20.372 [2024-04-26 20:04:04.569865] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.569896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.569986] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.570002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.570092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.570109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.570203] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.570219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.372 #12 NEW cov: 11898 ft: 14020 corp: 10/248b lim: 40 exec/s: 0 rss: 69Mb L: 39/39 MS: 1 CopyPart- 00:08:20.372 [2024-04-26 20:04:04.630464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.630488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.630582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.630598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.630680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.630695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.630783] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:20.372 [2024-04-26 20:04:04.630800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.630897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:dcdcdca5 cdw11:a5a5dc0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.630912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.372 #13 NEW cov: 11898 ft: 14141 corp: 11/288b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:08:20.372 [2024-04-26 20:04:04.680317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.680342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.680436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.680451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.680537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.680552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.680645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.680659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.372 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:20.372 #14 NEW cov: 11921 ft: 14193 corp: 12/327b lim: 40 exec/s: 0 rss: 69Mb L: 39/40 MS: 1 ShuffleBytes- 00:08:20.372 [2024-04-26 20:04:04.740563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.740588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.740695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.740711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.740800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.740816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.740918] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.740934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.372 #15 NEW cov: 11921 ft: 14203 corp: 13/366b lim: 40 exec/s: 0 rss: 69Mb L: 39/40 MS: 1 ShuffleBytes- 00:08:20.372 [2024-04-26 20:04:04.790624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.790650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.790744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.790759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.790863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.790883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.372 [2024-04-26 20:04:04.790972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.372 [2024-04-26 20:04:04.790987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.631 #16 NEW cov: 11921 ft: 14221 corp: 14/405b lim: 40 exec/s: 16 rss: 69Mb L: 39/40 MS: 1 CopyPart- 00:08:20.631 [2024-04-26 20:04:04.849838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.631 [2024-04-26 20:04:04.849864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.631 #17 NEW cov: 11921 ft: 14281 corp: 15/416b lim: 40 exec/s: 17 rss: 69Mb L: 11/40 MS: 1 EraseBytes- 00:08:20.631 [2024-04-26 20:04:04.900350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a59ea5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.631 [2024-04-26 20:04:04.900375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.631 [2024-04-26 20:04:04.900461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5bfa5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.631 [2024-04-26 20:04:04.900477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.631 #18 NEW cov: 11921 ft: 14304 corp: 16/432b lim: 40 exec/s: 18 rss: 70Mb L: 16/40 MS: 1 ChangeBinInt- 00:08:20.631 [2024-04-26 20:04:04.960176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aeeeeee cdw11:eeceeeee SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:08:20.631 [2024-04-26 20:04:04.960201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.631 #19 NEW cov: 11921 ft: 14338 corp: 17/444b lim: 40 exec/s: 19 rss: 70Mb L: 12/40 MS: 1 ChangeBit- 00:08:20.631 [2024-04-26 20:04:05.021434] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.631 [2024-04-26 20:04:05.021459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.631 [2024-04-26 20:04:05.021544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.631 [2024-04-26 20:04:05.021559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.631 [2024-04-26 20:04:05.021655] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.631 [2024-04-26 20:04:05.021670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.631 [2024-04-26 20:04:05.021754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:b6dcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.631 [2024-04-26 20:04:05.021771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.631 #20 NEW cov: 11921 ft: 14363 corp: 18/483b lim: 40 exec/s: 20 rss: 70Mb L: 39/40 MS: 1 CopyPart- 00:08:20.890 [2024-04-26 20:04:05.080994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.081019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.081108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a1a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.081124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.890 #21 NEW cov: 11921 ft: 14439 corp: 19/501b lim: 40 exec/s: 21 rss: 70Mb L: 18/40 MS: 1 CopyPart- 00:08:20.890 [2024-04-26 20:04:05.142264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.142291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.142381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:5bdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.142397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.142480] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.142495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.142582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.142598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.142684] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:dcdcdca5 cdw11:a5a5dc0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.142700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.890 #22 NEW cov: 11921 ft: 14471 corp: 20/541b lim: 40 exec/s: 22 rss: 70Mb L: 40/40 MS: 1 InsertByte- 00:08:20.890 [2024-04-26 20:04:05.202389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.202413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.202501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dc1edcdc cdw11:5bdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.202520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.202610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.202625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.202712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.202728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:20.890 [2024-04-26 20:04:05.202818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:8 nsid:0 cdw10:dcdcdca5 cdw11:a5a5dc0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.202835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:20.890 #23 NEW cov: 11921 ft: 14493 corp: 21/581b lim: 40 exec/s: 23 rss: 70Mb L: 40/40 MS: 1 ChangeBinInt- 00:08:20.890 [2024-04-26 20:04:05.261145] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5f6a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.261171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:20.890 #24 NEW cov: 11921 ft: 14568 
corp: 22/595b lim: 40 exec/s: 24 rss: 70Mb L: 14/40 MS: 1 CrossOver- 00:08:20.890 [2024-04-26 20:04:05.311396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aeeeeee cdw11:eeceeeee SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:20.890 [2024-04-26 20:04:05.311423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.148 #25 NEW cov: 11921 ft: 14603 corp: 23/607b lim: 40 exec/s: 25 rss: 70Mb L: 12/40 MS: 1 ChangeByte- 00:08:21.148 [2024-04-26 20:04:05.372640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcffdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.372667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.148 [2024-04-26 20:04:05.372766] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.372785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.148 [2024-04-26 20:04:05.372877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.372893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.148 [2024-04-26 20:04:05.372990] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.373007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.148 #26 NEW cov: 11921 ft: 14620 corp: 24/643b lim: 40 exec/s: 26 rss: 70Mb L: 36/40 MS: 1 ChangeByte- 00:08:21.148 [2024-04-26 20:04:05.421958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:48aa0ecc cdw11:b17f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.421984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.148 #27 NEW cov: 11921 ft: 14655 corp: 25/652b lim: 40 exec/s: 27 rss: 70Mb L: 9/40 MS: 1 CMP- DE: "H\252\016\314\261\177\000\000"- 00:08:21.148 [2024-04-26 20:04:05.472625] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:48aa0ecc cdw11:b17f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.472651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.148 [2024-04-26 20:04:05.472750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:f6a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.472767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.148 [2024-04-26 20:04:05.472861] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 
cdw10:a5a5a5a5 cdw11:a1a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.472880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.148 #28 NEW cov: 11921 ft: 14853 corp: 26/678b lim: 40 exec/s: 28 rss: 70Mb L: 26/40 MS: 1 PersAutoDict- DE: "H\252\016\314\261\177\000\000"- 00:08:21.148 [2024-04-26 20:04:05.532514] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a5a5a1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.532540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.148 [2024-04-26 20:04:05.532635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.532652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.148 #29 NEW cov: 11921 ft: 14873 corp: 27/700b lim: 40 exec/s: 29 rss: 70Mb L: 22/40 MS: 1 CopyPart- 00:08:21.148 [2024-04-26 20:04:05.582387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0aeeeeee cdw11:eeceeeee SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.148 [2024-04-26 20:04:05.582413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.442 #30 NEW cov: 11921 ft: 14885 corp: 28/715b lim: 40 exec/s: 30 rss: 70Mb L: 15/40 MS: 1 InsertRepeatedBytes- 00:08:21.442 [2024-04-26 20:04:05.632785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a5a5a1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.632810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.442 [2024-04-26 20:04:05.632914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.632930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.442 #31 NEW cov: 11921 ft: 14941 corp: 29/737b lim: 40 exec/s: 31 rss: 70Mb L: 22/40 MS: 1 ShuffleBytes- 00:08:21.442 [2024-04-26 20:04:05.693789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:dcdcdcdc cdw11:dcffdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.693814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.442 [2024-04-26 20:04:05.693913] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.693928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.442 [2024-04-26 20:04:05.694019] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 
[2024-04-26 20:04:05.694034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:21.442 [2024-04-26 20:04:05.694129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:dcdcdcdc cdw11:dcdcdcdc SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.694146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:21.442 #32 NEW cov: 11921 ft: 14996 corp: 30/772b lim: 40 exec/s: 32 rss: 71Mb L: 35/40 MS: 1 EraseBytes- 00:08:21.442 [2024-04-26 20:04:05.753256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a4a5a5 cdw11:a5a5a5a1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.753284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.442 [2024-04-26 20:04:05.753373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a5a5 cdw11:a5a5a5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.753389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.442 #33 NEW cov: 11921 ft: 15027 corp: 31/794b lim: 40 exec/s: 33 rss: 71Mb L: 22/40 MS: 1 ChangeBit- 00:08:21.442 [2024-04-26 20:04:05.813439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6a5a5a5 cdw11:a5a59ea5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.813463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:21.442 [2024-04-26 20:04:05.813553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a5a5a585 cdw11:a5bfa5a5 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:21.442 [2024-04-26 20:04:05.813570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:21.442 #34 NEW cov: 11921 ft: 15037 corp: 32/810b lim: 40 exec/s: 17 rss: 71Mb L: 16/40 MS: 1 ChangeBit- 00:08:21.442 #34 DONE cov: 11921 ft: 15037 corp: 32/810b lim: 40 exec/s: 17 rss: 71Mb 00:08:21.442 ###### Recommended dictionary. ###### 00:08:21.442 "H\252\016\314\261\177\000\000" # Uses: 1 00:08:21.442 ###### End of recommended dictionary. 
###### 00:08:21.442 Done 34 runs in 2 second(s) 00:08:21.700 20:04:05 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_11.conf /var/tmp/suppress_nvmf_fuzz 00:08:21.700 20:04:05 -- ../common.sh@72 -- # (( i++ )) 00:08:21.700 20:04:05 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:21.700 20:04:05 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:08:21.701 20:04:05 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:08:21.701 20:04:05 -- nvmf/run.sh@24 -- # local timen=1 00:08:21.701 20:04:05 -- nvmf/run.sh@25 -- # local core=0x1 00:08:21.701 20:04:05 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.701 20:04:05 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:08:21.701 20:04:05 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:21.701 20:04:05 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:21.701 20:04:05 -- nvmf/run.sh@34 -- # printf %02d 12 00:08:21.701 20:04:05 -- nvmf/run.sh@34 -- # port=4412 00:08:21.701 20:04:05 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.701 20:04:05 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:08:21.701 20:04:05 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:21.701 20:04:05 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:21.701 20:04:05 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:21.701 20:04:05 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 00:08:21.701 [2024-04-26 20:04:06.007074] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:21.701 [2024-04-26 20:04:06.007136] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625340 ] 00:08:21.701 EAL: No free 2048 kB hugepages reported on node 1 00:08:21.959 [2024-04-26 20:04:06.202910] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.959 [2024-04-26 20:04:06.275793] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.959 [2024-04-26 20:04:06.335107] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:21.959 [2024-04-26 20:04:06.351316] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:08:21.959 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:21.959 INFO: Seed: 2731866151 00:08:21.959 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:21.959 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:21.959 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:08:21.959 INFO: A corpus is not provided, starting from an empty corpus 00:08:21.959 #2 INITED exec/s: 0 rss: 63Mb 00:08:21.959 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:21.959 This may also happen if the target rejected all inputs we tried so far 00:08:22.217 [2024-04-26 20:04:06.406630] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.217 [2024-04-26 20:04:06.406659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.476 NEW_FUNC[1/671]: 0x492650 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:08:22.476 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:22.476 #6 NEW cov: 11675 ft: 11676 corp: 2/12b lim: 40 exec/s: 0 rss: 69Mb L: 11/11 MS: 4 ChangeBit-ChangeByte-ChangeBit-InsertRepeatedBytes- 00:08:22.476 [2024-04-26 20:04:06.727437] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.476 [2024-04-26 20:04:06.727471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.476 #7 NEW cov: 11805 ft: 12155 corp: 3/20b lim: 40 exec/s: 0 rss: 69Mb L: 8/11 MS: 1 CrossOver- 00:08:22.476 [2024-04-26 20:04:06.767413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.476 [2024-04-26 20:04:06.767439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.476 #8 NEW cov: 11811 ft: 12289 corp: 4/30b lim: 40 exec/s: 0 rss: 69Mb L: 10/11 MS: 1 EraseBytes- 00:08:22.476 [2024-04-26 20:04:06.807581] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d9d9d3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.476 [2024-04-26 20:04:06.807607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.476 #9 NEW cov: 11896 ft: 12529 corp: 5/39b lim: 40 exec/s: 0 rss: 69Mb L: 9/11 MS: 1 InsertByte- 00:08:22.476 [2024-04-26 20:04:06.847687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d9d0200 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.476 [2024-04-26 20:04:06.847711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.476 #10 NEW cov: 11896 ft: 12754 corp: 6/49b lim: 40 exec/s: 0 rss: 69Mb L: 10/11 MS: 1 CMP- DE: "\002\000"- 00:08:22.476 [2024-04-26 20:04:06.887792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 
cdw10:369d9d9d cdw11:9d9d9d02 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.476 [2024-04-26 20:04:06.887816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.476 #11 NEW cov: 11896 ft: 12879 corp: 7/61b lim: 40 exec/s: 0 rss: 69Mb L: 12/12 MS: 1 PersAutoDict- DE: "\002\000"- 00:08:22.734 [2024-04-26 20:04:06.928062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-04-26 20:04:06.928090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 [2024-04-26 20:04:06.928161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9d9d2929 cdw11:29292929 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-04-26 20:04:06.928175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.734 #12 NEW cov: 11896 ft: 13692 corp: 8/82b lim: 40 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:08:22.734 [2024-04-26 20:04:06.968041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d5902 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-04-26 20:04:06.968066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #13 NEW cov: 11896 ft: 13729 corp: 9/94b lim: 40 exec/s: 0 rss: 70Mb L: 12/21 MS: 1 ChangeByte- 00:08:22.734 [2024-04-26 20:04:07.008144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9d480000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-04-26 20:04:07.008168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #17 NEW cov: 11896 ft: 13758 corp: 10/103b lim: 40 exec/s: 0 rss: 70Mb L: 9/21 MS: 4 ChangeBit-ChangeBit-CrossOver-CMP- DE: "H\000\000\000\000\000\000\000"- 00:08:22.734 [2024-04-26 20:04:07.048304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d79 cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-04-26 20:04:07.048327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #18 NEW cov: 11896 ft: 13809 corp: 11/113b lim: 40 exec/s: 0 rss: 70Mb L: 10/21 MS: 1 ChangeByte- 00:08:22.734 [2024-04-26 20:04:07.088384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-04-26 20:04:07.088407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #19 NEW cov: 11896 ft: 13835 corp: 12/124b lim: 40 exec/s: 0 rss: 70Mb L: 11/21 MS: 1 CrossOver- 00:08:22.734 [2024-04-26 20:04:07.128512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d02009d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-04-26 20:04:07.128537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 
cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.734 #20 NEW cov: 11896 ft: 13934 corp: 13/137b lim: 40 exec/s: 0 rss: 70Mb L: 13/21 MS: 1 PersAutoDict- DE: "\002\000"- 00:08:22.734 [2024-04-26 20:04:07.168629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:569d4800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.734 [2024-04-26 20:04:07.168653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.992 #21 NEW cov: 11896 ft: 14019 corp: 14/147b lim: 40 exec/s: 0 rss: 70Mb L: 10/21 MS: 1 InsertByte- 00:08:22.992 [2024-04-26 20:04:07.208691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d9d9d00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.992 [2024-04-26 20:04:07.208715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.992 #22 NEW cov: 11896 ft: 14096 corp: 15/157b lim: 40 exec/s: 0 rss: 70Mb L: 10/21 MS: 1 CrossOver- 00:08:22.992 [2024-04-26 20:04:07.248822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9d480000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.992 [2024-04-26 20:04:07.248848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.992 #23 NEW cov: 11896 ft: 14122 corp: 16/166b lim: 40 exec/s: 0 rss: 70Mb L: 9/21 MS: 1 CopyPart- 00:08:22.992 [2024-04-26 20:04:07.289107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d9d4800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-04-26 20:04:07.289132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.993 [2024-04-26 20:04:07.289186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00009d3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-04-26 20:04:07.289200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:22.993 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:22.993 #24 NEW cov: 11919 ft: 14168 corp: 17/183b lim: 40 exec/s: 0 rss: 70Mb L: 17/21 MS: 1 PersAutoDict- DE: "H\000\000\000\000\000\000\000"- 00:08:22.993 [2024-04-26 20:04:07.339121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:636262be SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-04-26 20:04:07.339145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.993 #25 NEW cov: 11919 ft: 14207 corp: 18/192b lim: 40 exec/s: 0 rss: 70Mb L: 9/21 MS: 1 ChangeBinInt- 00:08:22.993 [2024-04-26 20:04:07.379219] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:00009d00 cdw11:48000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-04-26 20:04:07.379242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:22.993 #26 NEW cov: 11919 ft: 14226 corp: 19/201b lim: 40 exec/s: 
26 rss: 70Mb L: 9/21 MS: 1 ShuffleBytes- 00:08:22.993 [2024-04-26 20:04:07.419322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d1d9d cdw11:9d02009d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:22.993 [2024-04-26 20:04:07.419346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.252 #27 NEW cov: 11919 ft: 14266 corp: 20/214b lim: 40 exec/s: 27 rss: 70Mb L: 13/21 MS: 1 ChangeBit- 00:08:23.253 [2024-04-26 20:04:07.459634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d244800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-04-26 20:04:07.459658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.253 [2024-04-26 20:04:07.459712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00009d3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-04-26 20:04:07.459725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.253 #28 NEW cov: 11919 ft: 14276 corp: 21/231b lim: 40 exec/s: 28 rss: 70Mb L: 17/21 MS: 1 ChangeByte- 00:08:23.253 [2024-04-26 20:04:07.509743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9d9d9d9d cdw11:9d244800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-04-26 20:04:07.509767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.253 [2024-04-26 20:04:07.509839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00009d3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-04-26 20:04:07.509853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.253 #29 NEW cov: 11919 ft: 14329 corp: 22/248b lim: 40 exec/s: 29 rss: 70Mb L: 17/21 MS: 1 CopyPart- 00:08:23.253 [2024-04-26 20:04:07.549706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d79 cdw11:9d9d799d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-04-26 20:04:07.549733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.253 #30 NEW cov: 11919 ft: 14386 corp: 23/258b lim: 40 exec/s: 30 rss: 71Mb L: 10/21 MS: 1 CopyPart- 00:08:23.253 [2024-04-26 20:04:07.589841] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d9d81 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-04-26 20:04:07.589864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.253 #31 NEW cov: 11919 ft: 14397 corp: 24/269b lim: 40 exec/s: 31 rss: 71Mb L: 11/21 MS: 1 ChangeByte- 00:08:23.253 [2024-04-26 20:04:07.629939] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d9d9d3b SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-04-26 20:04:07.629963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 
sqhd:000f p:0 m:0 dnr:0 00:08:23.253 #32 NEW cov: 11919 ft: 14427 corp: 25/278b lim: 40 exec/s: 32 rss: 71Mb L: 9/21 MS: 1 ShuffleBytes- 00:08:23.253 [2024-04-26 20:04:07.670102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:1d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.253 [2024-04-26 20:04:07.670126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.253 #33 NEW cov: 11919 ft: 14471 corp: 26/288b lim: 40 exec/s: 33 rss: 71Mb L: 10/21 MS: 1 ChangeBit- 00:08:23.512 [2024-04-26 20:04:07.710165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-04-26 20:04:07.710189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.512 #34 NEW cov: 11919 ft: 14522 corp: 27/298b lim: 40 exec/s: 34 rss: 71Mb L: 10/21 MS: 1 ShuffleBytes- 00:08:23.512 [2024-04-26 20:04:07.740578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d79 cdw11:9d4a4a4a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-04-26 20:04:07.740601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.512 [2024-04-26 20:04:07.740670] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:4a4a4a4a cdw11:4a4a4a4a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-04-26 20:04:07.740684] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.512 [2024-04-26 20:04:07.740738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:4a4a4a4a cdw11:4a4a4a4a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-04-26 20:04:07.740751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:23.512 #35 NEW cov: 11919 ft: 14769 corp: 28/327b lim: 40 exec/s: 35 rss: 71Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:23.512 [2024-04-26 20:04:07.780527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d5902 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-04-26 20:04:07.780551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.512 [2024-04-26 20:04:07.780622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-04-26 20:04:07.780635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.512 #36 NEW cov: 11919 ft: 14774 corp: 29/347b lim: 40 exec/s: 36 rss: 71Mb L: 20/29 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\001"- 00:08:23.512 [2024-04-26 20:04:07.830567] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ac000048 cdw11:000000ac SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.512 [2024-04-26 20:04:07.830591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.513 #41 NEW cov: 11919 ft: 14784 corp: 30/355b lim: 40 exec/s: 41 rss: 71Mb L: 8/29 MS: 5 EraseBytes-CrossOver-ChangeByte-CopyPart-CopyPart- 00:08:23.513 [2024-04-26 20:04:07.870675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d1d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.513 [2024-04-26 20:04:07.870699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.513 #47 NEW cov: 11919 ft: 14801 corp: 31/365b lim: 40 exec/s: 47 rss: 71Mb L: 10/29 MS: 1 ShuffleBytes- 00:08:23.513 [2024-04-26 20:04:07.910961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9dff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.513 [2024-04-26 20:04:07.910985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.513 [2024-04-26 20:04:07.911038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffff9d cdw11:9d9d9d81 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.513 [2024-04-26 20:04:07.911052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.513 #48 NEW cov: 11919 ft: 14806 corp: 32/384b lim: 40 exec/s: 48 rss: 72Mb L: 19/29 MS: 1 InsertRepeatedBytes- 00:08:23.513 [2024-04-26 20:04:07.951091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d244800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.513 [2024-04-26 20:04:07.951115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.513 [2024-04-26 20:04:07.951173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:05000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.513 [2024-04-26 20:04:07.951186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.772 #49 NEW cov: 11919 ft: 14863 corp: 33/403b lim: 40 exec/s: 49 rss: 72Mb L: 19/29 MS: 1 CMP- DE: "\005\000"- 00:08:23.772 [2024-04-26 20:04:07.991008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:1d9d959d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-04-26 20:04:07.991034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.772 #50 NEW cov: 11919 ft: 14868 corp: 34/413b lim: 40 exec/s: 50 rss: 72Mb L: 10/29 MS: 1 ChangeBit- 00:08:23.772 [2024-04-26 20:04:08.031128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d9d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-04-26 20:04:08.031152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.772 #51 NEW cov: 11919 ft: 14907 corp: 35/425b lim: 40 exec/s: 51 rss: 72Mb L: 12/29 MS: 1 CMP- DE: "\000\000\001\000"- 00:08:23.772 [2024-04-26 20:04:08.071390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 
cdw10:0a369d9d cdw11:9d244800 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-04-26 20:04:08.071414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.772 [2024-04-26 20:04:08.071468] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:000000f9 cdw11:f9f90000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-04-26 20:04:08.071484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.772 [2024-04-26 20:04:08.111508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d244802 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-04-26 20:04:08.111532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.772 [2024-04-26 20:04:08.111584] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00f9f9f9 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-04-26 20:04:08.111598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.772 #53 NEW cov: 11919 ft: 14925 corp: 36/447b lim: 40 exec/s: 53 rss: 72Mb L: 22/29 MS: 2 InsertRepeatedBytes-PersAutoDict- DE: "\002\000"- 00:08:23.772 [2024-04-26 20:04:08.151479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d9d02 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-04-26 20:04:08.151504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.772 #54 NEW cov: 11919 ft: 15006 corp: 37/459b lim: 40 exec/s: 54 rss: 72Mb L: 12/29 MS: 1 ChangeByte- 00:08:23.772 [2024-04-26 20:04:08.191754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d79 cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.772 [2024-04-26 20:04:08.191777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:23.772 [2024-04-26 20:04:08.191832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:9d799d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:23.773 [2024-04-26 20:04:08.191846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:23.773 #55 NEW cov: 11919 ft: 15015 corp: 38/475b lim: 40 exec/s: 55 rss: 72Mb L: 16/29 MS: 1 CopyPart- 00:08:24.032 [2024-04-26 20:04:08.231722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369d9d cdw11:9d9d0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.032 [2024-04-26 20:04:08.231746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.032 #56 NEW cov: 11919 ft: 15019 corp: 39/487b lim: 40 exec/s: 56 rss: 72Mb L: 12/29 MS: 1 ChangeByte- 00:08:24.032 [2024-04-26 20:04:08.271837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d79 cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.032 [2024-04-26 
20:04:08.271860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.032 #57 NEW cov: 11919 ft: 15020 corp: 40/501b lim: 40 exec/s: 57 rss: 72Mb L: 14/29 MS: 1 PersAutoDict- DE: "\000\000\001\000"- 00:08:24.032 [2024-04-26 20:04:08.312003] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:369d9d9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.032 [2024-04-26 20:04:08.312029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.032 #58 NEW cov: 11919 ft: 15027 corp: 41/512b lim: 40 exec/s: 58 rss: 72Mb L: 11/29 MS: 1 CopyPart- 00:08:24.032 [2024-04-26 20:04:08.352149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:9d360a9d cdw11:9d9d9d9d SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.032 [2024-04-26 20:04:08.352175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.032 #59 NEW cov: 11919 ft: 15037 corp: 42/520b lim: 40 exec/s: 59 rss: 72Mb L: 8/29 MS: 1 ShuffleBytes- 00:08:24.032 [2024-04-26 20:04:08.392192] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0a369dbd cdw11:636262be SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:24.032 [2024-04-26 20:04:08.392219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.032 #60 NEW cov: 11919 ft: 15059 corp: 43/529b lim: 40 exec/s: 30 rss: 72Mb L: 9/29 MS: 1 ChangeBit- 00:08:24.032 #60 DONE cov: 11919 ft: 15059 corp: 43/529b lim: 40 exec/s: 30 rss: 72Mb 00:08:24.032 ###### Recommended dictionary. ###### 00:08:24.032 "\002\000" # Uses: 4 00:08:24.032 "H\000\000\000\000\000\000\000" # Uses: 1 00:08:24.032 "\000\000\000\000\000\000\000\001" # Uses: 0 00:08:24.032 "\005\000" # Uses: 0 00:08:24.032 "\000\000\001\000" # Uses: 1 00:08:24.032 ###### End of recommended dictionary. 
###### 00:08:24.032 Done 60 runs in 2 second(s) 00:08:24.292 20:04:08 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_12.conf /var/tmp/suppress_nvmf_fuzz 00:08:24.292 20:04:08 -- ../common.sh@72 -- # (( i++ )) 00:08:24.292 20:04:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:24.292 20:04:08 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1 00:08:24.292 20:04:08 -- nvmf/run.sh@23 -- # local fuzzer_type=13 00:08:24.292 20:04:08 -- nvmf/run.sh@24 -- # local timen=1 00:08:24.292 20:04:08 -- nvmf/run.sh@25 -- # local core=0x1 00:08:24.292 20:04:08 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:24.292 20:04:08 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf 00:08:24.292 20:04:08 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:24.292 20:04:08 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:24.292 20:04:08 -- nvmf/run.sh@34 -- # printf %02d 13 00:08:24.292 20:04:08 -- nvmf/run.sh@34 -- # port=4413 00:08:24.292 20:04:08 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:24.292 20:04:08 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' 00:08:24.292 20:04:08 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:24.292 20:04:08 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:24.292 20:04:08 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:24.292 20:04:08 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 00:08:24.292 [2024-04-26 20:04:08.599299] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:24.292 [2024-04-26 20:04:08.599393] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625702 ] 00:08:24.292 EAL: No free 2048 kB hugepages reported on node 1 00:08:24.552 [2024-04-26 20:04:08.799178] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.552 [2024-04-26 20:04:08.869639] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.552 [2024-04-26 20:04:08.928789] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:24.552 [2024-04-26 20:04:08.945023] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 *** 00:08:24.552 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:24.552 INFO: Seed: 1031894169 00:08:24.552 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:24.552 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:24.552 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 00:08:24.552 INFO: A corpus is not provided, starting from an empty corpus 00:08:24.552 #2 INITED exec/s: 0 rss: 63Mb 00:08:24.552 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:24.552 This may also happen if the target rejected all inputs we tried so far 00:08:24.811 [2024-04-26 20:04:08.999877] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.811 [2024-04-26 20:04:08.999931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:24.811 [2024-04-26 20:04:08.999967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.811 [2024-04-26 20:04:08.999983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:24.811 [2024-04-26 20:04:09.000013] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000015 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:24.811 [2024-04-26 20:04:09.000029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.071 NEW_FUNC[1/670]: 0x494210 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257 00:08:25.071 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:25.071 #16 NEW cov: 11663 ft: 11664 corp: 2/27b lim: 40 exec/s: 0 rss: 69Mb L: 26/26 MS: 4 CopyPart-InsertByte-InsertByte-InsertRepeatedBytes- 00:08:25.071 [2024-04-26 20:04:09.340706] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.071 [2024-04-26 20:04:09.340750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.071 [2024-04-26 20:04:09.340801] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.071 [2024-04-26 20:04:09.340820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.071 [2024-04-26 20:04:09.340852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:54000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.071 [2024-04-26 20:04:09.340869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.071 #17 NEW cov: 11793 ft: 12219 corp: 3/54b lim: 40 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 InsertByte- 00:08:25.071 [2024-04-26 20:04:09.410863] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.071 [2024-04-26 20:04:09.410910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.071 [2024-04-26 20:04:09.410946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.071 [2024-04-26 20:04:09.410978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.071 [2024-04-26 20:04:09.411009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.071 [2024-04-26 20:04:09.411025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.071 #18 NEW cov: 11799 ft: 12383 corp: 4/81b lim: 40 exec/s: 0 rss: 69Mb L: 27/27 MS: 1 CopyPart- 00:08:25.071 [2024-04-26 20:04:09.480880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.071 [2024-04-26 20:04:09.480930] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.071 [2024-04-26 20:04:09.480969] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.071 [2024-04-26 20:04:09.480985] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.330 #19 NEW cov: 11884 ft: 12907 corp: 5/98b lim: 40 exec/s: 0 rss: 69Mb L: 17/27 MS: 1 InsertRepeatedBytes- 00:08:25.330 [2024-04-26 20:04:09.541067] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:eeeeeeee cdw11:eeeeeeee SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.330 [2024-04-26 20:04:09.541098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.330 [2024-04-26 20:04:09.541147] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:eeeeeeee cdw11:eeeeeeee SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.330 [2024-04-26 20:04:09.541163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.330 #20 NEW cov: 11884 ft: 13018 corp: 6/118b lim: 40 exec/s: 0 rss: 69Mb L: 20/27 MS: 1 InsertRepeatedBytes- 00:08:25.330 [2024-04-26 20:04:09.591240] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.330 [2024-04-26 20:04:09.591269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.330 [2024-04-26 20:04:09.591318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 
20:04:09.591333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.591364] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.591379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.331 #21 NEW cov: 11884 ft: 13134 corp: 7/146b lim: 40 exec/s: 0 rss: 70Mb L: 28/28 MS: 1 InsertByte- 00:08:25.331 [2024-04-26 20:04:09.661569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000063 cdw11:63636363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.661599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.661634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63636363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.661649] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.661680] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:63000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.661695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.661724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.661739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.661769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:0015a50a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.661784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:25.331 #22 NEW cov: 11884 ft: 13697 corp: 8/186b lim: 40 exec/s: 0 rss: 70Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:25.331 [2024-04-26 20:04:09.721588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a636363 cdw11:63636363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.721619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.721653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63636300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.721669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.721699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 
cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.721714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.331 #23 NEW cov: 11884 ft: 13767 corp: 9/213b lim: 40 exec/s: 0 rss: 70Mb L: 27/40 MS: 1 CrossOver- 00:08:25.331 [2024-04-26 20:04:09.771754] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00400000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.771785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.771820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.771837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.331 [2024-04-26 20:04:09.771867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.331 [2024-04-26 20:04:09.771891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.590 #24 NEW cov: 11884 ft: 13896 corp: 10/240b lim: 40 exec/s: 0 rss: 70Mb L: 27/40 MS: 1 ChangeBit- 00:08:25.590 [2024-04-26 20:04:09.821833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a636363 cdw11:63636363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.821863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.590 [2024-04-26 20:04:09.821920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63636300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.821936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.590 [2024-04-26 20:04:09.821967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.821983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.590 #25 NEW cov: 11884 ft: 13951 corp: 11/267b lim: 40 exec/s: 0 rss: 70Mb L: 27/40 MS: 1 ChangeBinInt- 00:08:25.590 [2024-04-26 20:04:09.892049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00002000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.892079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.590 [2024-04-26 20:04:09.892113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.892132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 
p:0 m:0 dnr:0 00:08:25.590 [2024-04-26 20:04:09.892162] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.892177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.590 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:25.590 #26 NEW cov: 11907 ft: 14021 corp: 12/295b lim: 40 exec/s: 0 rss: 70Mb L: 28/40 MS: 1 ChangeBit- 00:08:25.590 [2024-04-26 20:04:09.962205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.962235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.590 [2024-04-26 20:04:09.962284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.962300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.590 [2024-04-26 20:04:09.962330] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:09.962345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.590 #27 NEW cov: 11907 ft: 14044 corp: 13/323b lim: 40 exec/s: 27 rss: 70Mb L: 28/40 MS: 1 ShuffleBytes- 00:08:25.590 [2024-04-26 20:04:10.012236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:8a000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.590 [2024-04-26 20:04:10.012267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 #30 NEW cov: 11907 ft: 14407 corp: 14/334b lim: 40 exec/s: 30 rss: 70Mb L: 11/40 MS: 3 ShuffleBytes-ChangeBit-InsertRepeatedBytes- 00:08:25.849 [2024-04-26 20:04:10.072488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.072523] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 [2024-04-26 20:04:10.072576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:000000ff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.072592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.849 #31 NEW cov: 11907 ft: 14453 corp: 15/354b lim: 40 exec/s: 31 rss: 70Mb L: 20/40 MS: 1 CrossOver- 00:08:25.849 [2024-04-26 20:04:10.142730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.142762] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 [2024-04-26 20:04:10.142796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.142812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.849 [2024-04-26 20:04:10.142842] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.142861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.849 #32 NEW cov: 11907 ft: 14468 corp: 16/385b lim: 40 exec/s: 32 rss: 70Mb L: 31/40 MS: 1 CrossOver- 00:08:25.849 [2024-04-26 20:04:10.212853] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a636363 cdw11:63636363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.212888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:25.849 [2024-04-26 20:04:10.212938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.212954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:25.849 [2024-04-26 20:04:10.212985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.213000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:25.849 #33 NEW cov: 11907 ft: 14480 corp: 17/412b lim: 40 exec/s: 33 rss: 70Mb L: 27/40 MS: 1 CopyPart- 00:08:25.849 [2024-04-26 20:04:10.262930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:25.849 [2024-04-26 20:04:10.262960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.108 #34 NEW cov: 11907 ft: 14545 corp: 18/426b lim: 40 exec/s: 34 rss: 70Mb L: 14/40 MS: 1 EraseBytes- 00:08:26.108 [2024-04-26 20:04:10.333207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.333239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.108 [2024-04-26 20:04:10.333273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.333288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.108 [2024-04-26 20:04:10.333319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.333334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.108 #35 NEW cov: 11907 ft: 14581 corp: 19/457b lim: 40 exec/s: 35 rss: 71Mb L: 31/40 MS: 1 CopyPart- 00:08:26.108 [2024-04-26 20:04:10.403407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a636363 cdw11:63630000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.403437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.108 [2024-04-26 20:04:10.403471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00006363 cdw11:63636363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.403487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.108 [2024-04-26 20:04:10.403518] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:63636300 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.403533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.108 #36 NEW cov: 11907 ft: 14643 corp: 20/488b lim: 40 exec/s: 36 rss: 71Mb L: 31/40 MS: 1 CrossOver- 00:08:26.108 [2024-04-26 20:04:10.473538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.473569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.108 [2024-04-26 20:04:10.473604] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:542fa50a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.473620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.108 #37 NEW cov: 11907 ft: 14676 corp: 21/504b lim: 40 exec/s: 37 rss: 71Mb L: 16/40 MS: 1 EraseBytes- 00:08:26.108 [2024-04-26 20:04:10.523591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.523620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.108 [2024-04-26 20:04:10.523669] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.108 [2024-04-26 20:04:10.523685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.368 #38 NEW cov: 11907 ft: 14709 corp: 22/524b lim: 40 exec/s: 38 rss: 71Mb L: 20/40 MS: 1 EraseBytes- 00:08:26.368 [2024-04-26 20:04:10.573831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.573861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.573917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00200000 cdw11:00000035 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.573933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.573964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.573980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.368 #39 NEW cov: 11907 ft: 14732 corp: 23/555b lim: 40 exec/s: 39 rss: 71Mb L: 31/40 MS: 1 ChangeByte- 00:08:26.368 [2024-04-26 20:04:10.644046] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.644076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.644110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.644126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.644156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:20000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.644171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.368 #40 NEW cov: 11907 ft: 14744 corp: 24/586b lim: 40 exec/s: 40 rss: 71Mb L: 31/40 MS: 1 ChangeBit- 00:08:26.368 [2024-04-26 20:04:10.694161] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.694195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.694244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.694260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.694290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.694305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.694335] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.694351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.368 #41 NEW cov: 11907 ft: 14816 corp: 25/621b lim: 40 exec/s: 41 rss: 71Mb L: 35/40 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:26.368 [2024-04-26 20:04:10.754271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00002000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.754301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.754350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00200000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.754366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.754396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000104 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.754412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.368 #42 NEW cov: 11907 ft: 14885 corp: 26/652b lim: 40 exec/s: 42 rss: 71Mb L: 31/40 MS: 1 ChangeBinInt- 00:08:26.368 [2024-04-26 20:04:10.804469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00010000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.804498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.804548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:f2000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.804564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.804594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.804609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.368 [2024-04-26 20:04:10.804639] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.368 [2024-04-26 20:04:10.804654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.628 #43 NEW cov: 11907 ft: 14898 corp: 27/687b lim: 40 exec/s: 43 rss: 71Mb L: 35/40 MS: 1 ChangeByte- 00:08:26.628 [2024-04-26 20:04:10.874693] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a090909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.874725] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.628 [2024-04-26 20:04:10.874759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:09090909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.874775] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.628 [2024-04-26 20:04:10.874805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:09090909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.874820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.628 [2024-04-26 20:04:10.874850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:09090909 cdw11:09090909 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.874865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:26.628 #44 NEW cov: 11907 ft: 14910 corp: 28/720b lim: 40 exec/s: 44 rss: 71Mb L: 33/40 MS: 1 InsertRepeatedBytes- 00:08:26.628 [2024-04-26 20:04:10.934715] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.934745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.628 [2024-04-26 20:04:10.934793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.934808] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.628 #45 NEW cov: 11907 ft: 14930 corp: 29/737b lim: 40 exec/s: 45 rss: 71Mb L: 17/40 MS: 1 CopyPart- 00:08:26.628 [2024-04-26 20:04:10.984964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:0a636363 cdw11:63636363 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.984996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:26.628 [2024-04-26 20:04:10.985048] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:63636363 cdw11:63636300 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.985066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:26.628 [2024-04-26 20:04:10.985099] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:26.628 [2024-04-26 20:04:10.985116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:26.628 #46 NEW cov: 11907 ft: 14950 corp: 30/768b lim: 40 exec/s: 23 rss: 71Mb L: 31/40 MS: 1 InsertRepeatedBytes- 00:08:26.628 #46 DONE cov: 11907 ft: 14950 corp: 30/768b lim: 40 exec/s: 23 
rss: 71Mb 00:08:26.628 ###### Recommended dictionary. ###### 00:08:26.628 "\001\000\000\000\000\000\000\000" # Uses: 0 00:08:26.628 ###### End of recommended dictionary. ###### 00:08:26.628 Done 46 runs in 2 second(s) 00:08:26.888 20:04:11 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_13.conf /var/tmp/suppress_nvmf_fuzz 00:08:26.888 20:04:11 -- ../common.sh@72 -- # (( i++ )) 00:08:26.888 20:04:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:26.888 20:04:11 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:26.888 20:04:11 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:26.888 20:04:11 -- nvmf/run.sh@24 -- # local timen=1 00:08:26.888 20:04:11 -- nvmf/run.sh@25 -- # local core=0x1 00:08:26.888 20:04:11 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:26.888 20:04:11 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:26.888 20:04:11 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:26.888 20:04:11 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:26.888 20:04:11 -- nvmf/run.sh@34 -- # printf %02d 14 00:08:26.888 20:04:11 -- nvmf/run.sh@34 -- # port=4414 00:08:26.888 20:04:11 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:26.888 20:04:11 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:26.888 20:04:11 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:26.888 20:04:11 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:26.888 20:04:11 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:26.888 20:04:11 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 00:08:26.888 [2024-04-26 20:04:11.195537] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:26.888 [2024-04-26 20:04:11.195610] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626055 ] 00:08:26.888 EAL: No free 2048 kB hugepages reported on node 1 00:08:27.147 [2024-04-26 20:04:11.391924] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.147 [2024-04-26 20:04:11.463183] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.147 [2024-04-26 20:04:11.522311] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:27.147 [2024-04-26 20:04:11.538532] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:27.147 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:27.147 INFO: Seed: 3624900913 00:08:27.147 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:27.147 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:27.147 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:27.147 INFO: A corpus is not provided, starting from an empty corpus 00:08:27.147 #2 INITED exec/s: 0 rss: 62Mb 00:08:27.147 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:27.147 This may also happen if the target rejected all inputs we tried so far 00:08:27.147 [2024-04-26 20:04:11.583292] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.147 [2024-04-26 20:04:11.583327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.665 NEW_FUNC[1/669]: 0x495dd0 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:27.665 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:27.665 #10 NEW cov: 11630 ft: 11658 corp: 2/11b lim: 35 exec/s: 0 rss: 69Mb L: 10/10 MS: 3 InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:08:27.665 [2024-04-26 20:04:11.946130] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.665 [2024-04-26 20:04:11.946177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.665 NEW_FUNC[1/2]: 0x19bf060 in reactor_post_process_lw_thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:864 00:08:27.665 NEW_FUNC[2/2]: 0x19c2cb0 in _reactor_run /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:894 00:08:27.665 #11 NEW cov: 11794 ft: 12347 corp: 3/21b lim: 35 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ShuffleBytes- 00:08:27.665 [2024-04-26 20:04:12.006255] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.665 [2024-04-26 20:04:12.006285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.665 #12 NEW cov: 11800 ft: 12568 corp: 4/31b lim: 35 exec/s: 0 rss: 69Mb L: 10/10 MS: 1 ChangeBit- 00:08:27.665 [2024-04-26 20:04:12.056806] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.665 [2024-04-26 20:04:12.056836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.665 [2024-04-26 20:04:12.056933] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.665 [2024-04-26 20:04:12.056951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.665 #13 NEW cov: 11885 ft: 13460 corp: 5/50b lim: 35 exec/s: 0 rss: 70Mb L: 19/19 MS: 1 CrossOver- 00:08:27.925 [2024-04-26 20:04:12.116590] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-04-26 20:04:12.116619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.925 #14 NEW cov: 11885 ft: 13544 corp: 6/60b lim: 35 exec/s: 0 rss: 70Mb L: 10/19 MS: 1 CopyPart- 00:08:27.925 [2024-04-26 20:04:12.177112] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-04-26 20:04:12.177141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.925 [2024-04-26 20:04:12.177237] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-04-26 20:04:12.177254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.925 #15 NEW cov: 11885 ft: 13582 corp: 7/79b lim: 35 exec/s: 0 rss: 70Mb L: 19/19 MS: 1 CopyPart- 00:08:27.925 [2024-04-26 20:04:12.236990] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-04-26 20:04:12.237019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.925 #16 NEW cov: 11885 ft: 13633 corp: 8/89b lim: 35 exec/s: 0 rss: 70Mb L: 10/19 MS: 1 CrossOver- 00:08:27.925 [2024-04-26 20:04:12.287180] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-04-26 20:04:12.287209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.925 #17 NEW cov: 11885 ft: 13757 corp: 9/99b lim: 35 exec/s: 0 rss: 70Mb L: 10/19 MS: 1 ChangeBit- 00:08:27.925 [2024-04-26 20:04:12.348019] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-04-26 20:04:12.348046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:27.925 [2024-04-26 20:04:12.348134] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-04-26 20:04:12.348154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:27.925 [2024-04-26 20:04:12.348239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.925 [2024-04-26 20:04:12.348256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.184 #24 NEW cov: 11885 ft: 13996 corp: 10/121b lim: 35 exec/s: 0 rss: 70Mb L: 22/22 MS: 2 EraseBytes-InsertRepeatedBytes- 00:08:28.184 [2024-04-26 20:04:12.397520] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-04-26 20:04:12.397550] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.184 #25 NEW cov: 11885 ft: 14051 corp: 11/131b lim: 35 exec/s: 0 rss: 70Mb L: 10/22 MS: 1 ChangeByte- 00:08:28.184 [2024-04-26 20:04:12.448041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-04-26 20:04:12.448066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.184 [2024-04-26 20:04:12.448161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-04-26 20:04:12.448177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.184 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:28.184 #26 NEW cov: 11908 ft: 14058 corp: 12/149b lim: 35 exec/s: 0 rss: 70Mb L: 18/22 MS: 1 CrossOver- 00:08:28.184 [2024-04-26 20:04:12.507948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-04-26 20:04:12.507974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.184 #30 NEW cov: 11908 ft: 14130 corp: 13/162b lim: 35 exec/s: 0 rss: 70Mb L: 13/22 MS: 4 InsertByte-ChangeBit-InsertByte-CrossOver- 00:08:28.184 [2024-04-26 20:04:12.558412] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000b1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-04-26 20:04:12.558438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.184 [2024-04-26 20:04:12.558531] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-04-26 20:04:12.558546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.184 #31 NEW cov: 11908 ft: 14159 corp: 14/181b lim: 35 exec/s: 31 rss: 70Mb L: 19/22 MS: 1 ChangeByte- 00:08:28.184 [2024-04-26 20:04:12.608339] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.184 [2024-04-26 20:04:12.608365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.444 #32 NEW cov: 11908 ft: 14184 corp: 15/192b lim: 35 exec/s: 32 rss: 70Mb L: 11/22 MS: 1 CrossOver- 00:08:28.444 [2024-04-26 20:04:12.658766] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.444 [2024-04-26 20:04:12.658794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.444 [2024-04-26 20:04:12.658883] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.444 [2024-04-26 20:04:12.658897] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.444 #33 NEW cov: 11908 ft: 14196 corp: 16/210b lim: 35 exec/s: 33 rss: 70Mb L: 18/22 MS: 1 CopyPart- 00:08:28.444 [2024-04-26 20:04:12.718632] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.444 [2024-04-26 20:04:12.718660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.444 #34 NEW cov: 11908 ft: 14239 corp: 17/221b lim: 35 exec/s: 34 rss: 70Mb L: 11/22 MS: 1 InsertByte- 00:08:28.444 [2024-04-26 20:04:12.768714] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.444 [2024-04-26 20:04:12.768740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.444 #35 NEW cov: 11908 ft: 14277 corp: 18/234b lim: 35 exec/s: 35 rss: 71Mb L: 13/22 MS: 1 ChangeByte- 00:08:28.444 [2024-04-26 20:04:12.818906] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000003f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.444 [2024-04-26 20:04:12.818931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.444 #36 NEW cov: 11908 ft: 14337 corp: 19/247b lim: 35 exec/s: 36 rss: 71Mb L: 13/22 MS: 1 ShuffleBytes- 00:08:28.444 [2024-04-26 20:04:12.869850] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.444 [2024-04-26 20:04:12.869878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.444 [2024-04-26 20:04:12.869971] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.444 [2024-04-26 20:04:12.869988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.444 [2024-04-26 20:04:12.870082] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.444 [2024-04-26 20:04:12.870100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.704 #37 NEW cov: 11908 ft: 14369 corp: 20/269b lim: 35 exec/s: 37 rss: 71Mb L: 22/22 MS: 1 CopyPart- 00:08:28.704 [2024-04-26 20:04:12.929659] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.704 [2024-04-26 20:04:12.929686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.704 [2024-04-26 20:04:12.929777] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.704 [2024-04-26 20:04:12.929793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.704 #38 NEW cov: 11908 ft: 14386 corp: 21/287b lim: 35 exec/s: 38 rss: 71Mb L: 18/22 MS: 1 
ShuffleBytes- 00:08:28.704 [2024-04-26 20:04:12.990232] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.704 [2024-04-26 20:04:12.990257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.704 [2024-04-26 20:04:12.990351] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.704 [2024-04-26 20:04:12.990369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.704 [2024-04-26 20:04:12.990459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.704 [2024-04-26 20:04:12.990476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.704 #39 NEW cov: 11908 ft: 14455 corp: 22/309b lim: 35 exec/s: 39 rss: 71Mb L: 22/22 MS: 1 ChangeBinInt- 00:08:28.704 [2024-04-26 20:04:13.050108] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000006e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.704 [2024-04-26 20:04:13.050134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.704 [2024-04-26 20:04:13.050225] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000006e SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.704 [2024-04-26 20:04:13.050241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.704 #44 NEW cov: 11908 ft: 14496 corp: 23/329b lim: 35 exec/s: 44 rss: 71Mb L: 20/22 MS: 5 CopyPart-ShuffleBytes-CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:28.704 NEW_FUNC[1/2]: 0x4b7290 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:28.704 NEW_FUNC[2/2]: 0x1174e50 in nvmf_ctrlr_set_features_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1709 00:08:28.704 #45 NEW cov: 11941 ft: 14531 corp: 24/338b lim: 35 exec/s: 45 rss: 71Mb L: 9/22 MS: 1 CrossOver- 00:08:28.963 [2024-04-26 20:04:13.150489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.150517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.963 [2024-04-26 20:04:13.150617] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.150633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.963 #46 NEW cov: 11941 ft: 14536 corp: 25/358b lim: 35 exec/s: 46 rss: 71Mb L: 20/22 MS: 1 InsertByte- 00:08:28.963 [2024-04-26 20:04:13.200819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.200845] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.963 [2024-04-26 20:04:13.200945] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.200964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.963 [2024-04-26 20:04:13.201060] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.201081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.963 #47 NEW cov: 11941 ft: 14558 corp: 26/380b lim: 35 exec/s: 47 rss: 71Mb L: 22/22 MS: 1 ChangeBit- 00:08:28.963 [2024-04-26 20:04:13.261389] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.261415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.963 [2024-04-26 20:04:13.261505] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.261524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.963 [2024-04-26 20:04:13.261624] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.261640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:28.963 [2024-04-26 20:04:13.261733] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.261754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:28.963 #51 NEW cov: 11941 ft: 14852 corp: 27/412b lim: 35 exec/s: 51 rss: 71Mb L: 32/32 MS: 4 EraseBytes-ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:28.963 #52 NEW cov: 11941 ft: 14865 corp: 28/422b lim: 35 exec/s: 52 rss: 72Mb L: 10/32 MS: 1 InsertByte- 00:08:28.963 [2024-04-26 20:04:13.361133] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.361161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:28.963 [2024-04-26 20:04:13.361260] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:28.963 [2024-04-26 20:04:13.361280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:28.963 #53 NEW cov: 11941 ft: 14884 corp: 29/442b lim: 35 exec/s: 53 rss: 72Mb L: 20/32 MS: 1 InsertRepeatedBytes- 00:08:29.222 [2024-04-26 20:04:13.421573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: SET FEATURES RESERVED cid:4 cdw10:80000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.222 [2024-04-26 20:04:13.421601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.222 [2024-04-26 20:04:13.421705] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.222 [2024-04-26 20:04:13.421721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.222 #54 NEW cov: 11941 ft: 14917 corp: 30/460b lim: 35 exec/s: 54 rss: 72Mb L: 18/32 MS: 1 CrossOver- 00:08:29.223 [2024-04-26 20:04:13.471510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000023 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.223 [2024-04-26 20:04:13.471535] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.223 [2024-04-26 20:04:13.471635] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.223 [2024-04-26 20:04:13.471651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.223 #55 NEW cov: 11941 ft: 14943 corp: 31/474b lim: 35 exec/s: 55 rss: 72Mb L: 14/32 MS: 1 InsertRepeatedBytes- 00:08:29.223 [2024-04-26 20:04:13.521627] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000b1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.223 [2024-04-26 20:04:13.521655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.223 [2024-04-26 20:04:13.521751] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000053 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.223 [2024-04-26 20:04:13.521768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.223 #56 NEW cov: 11941 ft: 14955 corp: 32/493b lim: 35 exec/s: 56 rss: 72Mb L: 19/32 MS: 1 ShuffleBytes- 00:08:29.223 [2024-04-26 20:04:13.582179] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.223 [2024-04-26 20:04:13.582207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.223 [2024-04-26 20:04:13.582299] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.223 [2024-04-26 20:04:13.582321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.223 [2024-04-26 20:04:13.582412] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:800000ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:29.223 [2024-04-26 20:04:13.582430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.223 #57 NEW cov: 11941 ft: 14989 corp: 33/519b lim: 35 exec/s: 28 rss: 72Mb L: 26/32 MS: 1 CopyPart- 00:08:29.223 #57 DONE cov: 11941 
ft: 14989 corp: 33/519b lim: 35 exec/s: 28 rss: 72Mb 00:08:29.223 Done 57 runs in 2 second(s) 00:08:29.482 20:04:13 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_14.conf /var/tmp/suppress_nvmf_fuzz 00:08:29.482 20:04:13 -- ../common.sh@72 -- # (( i++ )) 00:08:29.482 20:04:13 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:29.482 20:04:13 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:29.482 20:04:13 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:29.482 20:04:13 -- nvmf/run.sh@24 -- # local timen=1 00:08:29.482 20:04:13 -- nvmf/run.sh@25 -- # local core=0x1 00:08:29.482 20:04:13 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.482 20:04:13 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:29.482 20:04:13 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:29.482 20:04:13 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:29.482 20:04:13 -- nvmf/run.sh@34 -- # printf %02d 15 00:08:29.482 20:04:13 -- nvmf/run.sh@34 -- # port=4415 00:08:29.482 20:04:13 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.482 20:04:13 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:29.482 20:04:13 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:29.482 20:04:13 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:29.482 20:04:13 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:29.482 20:04:13 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 00:08:29.483 [2024-04-26 20:04:13.775643] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:29.483 [2024-04-26 20:04:13.775713] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626417 ] 00:08:29.483 EAL: No free 2048 kB hugepages reported on node 1 00:08:29.742 [2024-04-26 20:04:13.970257] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:29.742 [2024-04-26 20:04:14.040956] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:29.742 [2024-04-26 20:04:14.100108] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:29.742 [2024-04-26 20:04:14.116310] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:29.742 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:29.742 INFO: Seed: 1905924423 00:08:29.742 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:29.742 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:29.742 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:29.742 INFO: A corpus is not provided, starting from an empty corpus 00:08:29.742 #2 INITED exec/s: 0 rss: 63Mb 00:08:29.742 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:29.742 This may also happen if the target rejected all inputs we tried so far 00:08:29.742 [2024-04-26 20:04:14.165623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.742 [2024-04-26 20:04:14.165655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:29.742 [2024-04-26 20:04:14.165729] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.742 [2024-04-26 20:04:14.165743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:29.742 [2024-04-26 20:04:14.165800] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.742 [2024-04-26 20:04:14.165814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:29.742 [2024-04-26 20:04:14.165869] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:29.742 [2024-04-26 20:04:14.165890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.260 NEW_FUNC[1/669]: 0x497310 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:30.261 NEW_FUNC[2/669]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:30.261 #7 NEW cov: 11636 ft: 11642 corp: 2/33b lim: 35 exec/s: 0 rss: 69Mb L: 32/32 MS: 5 ChangeByte-InsertByte-ChangeBit-EraseBytes-InsertRepeatedBytes- 00:08:30.261 [2024-04-26 20:04:14.476142] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.476188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.261 [2024-04-26 20:04:14.476256] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.476276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.261 NEW_FUNC[1/1]: 0x1ce6410 in spdk_thread_get_last_tsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1310 00:08:30.261 #18 NEW cov: 11775 ft: 12638 corp: 3/50b lim: 35 exec/s: 0 rss: 69Mb L: 17/32 MS: 1 EraseBytes- 00:08:30.261 [2024-04-26 20:04:14.526050] nvme_qpair.c: 215:nvme_admin_qpair_print_command: 
*NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.526076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.261 #19 NEW cov: 11781 ft: 13230 corp: 4/59b lim: 35 exec/s: 0 rss: 69Mb L: 9/32 MS: 1 EraseBytes- 00:08:30.261 [2024-04-26 20:04:14.566557] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.566584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.261 [2024-04-26 20:04:14.566641] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.566654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.261 [2024-04-26 20:04:14.566709] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.566723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.261 NEW_FUNC[1/1]: 0x4b7290 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:30.261 #26 NEW cov: 11880 ft: 13779 corp: 5/92b lim: 35 exec/s: 0 rss: 69Mb L: 33/33 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:30.261 [2024-04-26 20:04:14.606236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.606261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.261 #27 NEW cov: 11880 ft: 13809 corp: 6/101b lim: 35 exec/s: 0 rss: 70Mb L: 9/33 MS: 1 ShuffleBytes- 00:08:30.261 [2024-04-26 20:04:14.646776] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.646800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.261 [2024-04-26 20:04:14.646876] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.646890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.261 [2024-04-26 20:04:14.646945] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.646970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.261 #28 NEW cov: 11880 ft: 13899 corp: 7/134b lim: 35 exec/s: 0 rss: 70Mb L: 33/33 MS: 1 CrossOver- 00:08:30.261 [2024-04-26 20:04:14.686609] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.686634] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.261 [2024-04-26 20:04:14.686690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.261 [2024-04-26 20:04:14.686704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.520 #29 NEW cov: 11880 ft: 13956 corp: 8/151b lim: 35 exec/s: 0 rss: 70Mb L: 17/33 MS: 1 ChangeBit- 00:08:30.520 [2024-04-26 20:04:14.726611] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.520 [2024-04-26 20:04:14.726638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.520 #30 NEW cov: 11880 ft: 13976 corp: 9/160b lim: 35 exec/s: 0 rss: 70Mb L: 9/33 MS: 1 CopyPart- 00:08:30.520 [2024-04-26 20:04:14.767151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.520 [2024-04-26 20:04:14.767177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.520 [2024-04-26 20:04:14.767236] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.767251] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.521 [2024-04-26 20:04:14.767308] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.767321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.521 #31 NEW cov: 11880 ft: 14004 corp: 10/193b lim: 35 exec/s: 0 rss: 70Mb L: 33/33 MS: 1 ChangeBinInt- 00:08:30.521 [2024-04-26 20:04:14.806848] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.806878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.521 #32 NEW cov: 11880 ft: 14068 corp: 11/202b lim: 35 exec/s: 0 rss: 70Mb L: 9/33 MS: 1 ChangeBinInt- 00:08:30.521 [2024-04-26 20:04:14.847327] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.847354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.521 [2024-04-26 20:04:14.847425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.847439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.521 [2024-04-26 20:04:14.847497] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.847510] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.521 #33 NEW cov: 11880 ft: 14072 corp: 12/236b lim: 35 exec/s: 0 rss: 70Mb L: 34/34 MS: 1 CopyPart- 00:08:30.521 [2024-04-26 20:04:14.887193] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.887218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.521 [2024-04-26 20:04:14.887291] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.887306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.521 #34 NEW cov: 11880 ft: 14113 corp: 13/254b lim: 35 exec/s: 0 rss: 70Mb L: 18/34 MS: 1 InsertByte- 00:08:30.521 [2024-04-26 20:04:14.927585] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.927610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.521 [2024-04-26 20:04:14.927665] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.927679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.521 [2024-04-26 20:04:14.927734] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.521 [2024-04-26 20:04:14.927747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.521 #35 NEW cov: 11880 ft: 14170 corp: 14/288b lim: 35 exec/s: 0 rss: 70Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:30.780 [2024-04-26 20:04:14.977274] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:14.977298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.780 #36 NEW cov: 11880 ft: 14250 corp: 15/297b lim: 35 exec/s: 0 rss: 70Mb L: 9/34 MS: 1 ChangeByte- 00:08:30.780 [2024-04-26 20:04:15.017413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.017438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.780 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:30.780 #37 NEW cov: 11903 ft: 14307 corp: 16/307b lim: 35 exec/s: 0 rss: 71Mb L: 10/34 MS: 1 CrossOver- 00:08:30.780 [2024-04-26 20:04:15.068005] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.068033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 
sqhd:0010 p:0 m:0 dnr:0 00:08:30.780 [2024-04-26 20:04:15.068090] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.068103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.780 [2024-04-26 20:04:15.068160] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000001ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.068173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.780 #38 NEW cov: 11903 ft: 14335 corp: 17/340b lim: 35 exec/s: 0 rss: 71Mb L: 33/34 MS: 1 ChangeByte- 00:08:30.780 [2024-04-26 20:04:15.108108] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.108133] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.780 [2024-04-26 20:04:15.108190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.108204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.780 [2024-04-26 20:04:15.108258] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.108271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:30.780 #39 NEW cov: 11903 ft: 14382 corp: 18/373b lim: 35 exec/s: 0 rss: 71Mb L: 33/34 MS: 1 ChangeBit- 00:08:30.780 [2024-04-26 20:04:15.148008] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.148032] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.780 [2024-04-26 20:04:15.148105] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.780 [2024-04-26 20:04:15.148119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:30.780 [2024-04-26 20:04:15.148176] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.781 [2024-04-26 20:04:15.148189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:30.781 #40 NEW cov: 11903 ft: 14443 corp: 19/396b lim: 35 exec/s: 40 rss: 71Mb L: 23/34 MS: 1 InsertRepeatedBytes- 00:08:30.781 [2024-04-26 20:04:15.187892] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000000b3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:30.781 [2024-04-26 20:04:15.187916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:30.781 #41 NEW cov: 11903 ft: 14464 corp: 20/405b lim: 35 
exec/s: 41 rss: 71Mb L: 9/34 MS: 1 ChangeBit- 00:08:31.040 [2024-04-26 20:04:15.228252] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.228276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.228349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.228363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.228421] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.228435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.040 #42 NEW cov: 11903 ft: 14527 corp: 21/428b lim: 35 exec/s: 42 rss: 71Mb L: 23/34 MS: 1 CMP- DE: "\002\000"- 00:08:31.040 [2024-04-26 20:04:15.278566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.278590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.278664] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.278679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.278735] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.278748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.040 #43 NEW cov: 11903 ft: 14539 corp: 22/461b lim: 35 exec/s: 43 rss: 71Mb L: 33/34 MS: 1 CopyPart- 00:08:31.040 [2024-04-26 20:04:15.318556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.318580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.318650] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.318665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.040 #44 NEW cov: 11903 ft: 14544 corp: 23/483b lim: 35 exec/s: 44 rss: 71Mb L: 22/34 MS: 1 EraseBytes- 00:08:31.040 [2024-04-26 20:04:15.358797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.358821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.358880] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.358909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.358965] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.358978] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.040 #45 NEW cov: 11903 ft: 14562 corp: 24/517b lim: 35 exec/s: 45 rss: 71Mb L: 34/34 MS: 1 InsertByte- 00:08:31.040 [2024-04-26 20:04:15.398740] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.398763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.398819] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.398833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.398889] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.398922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.040 #46 NEW cov: 11903 ft: 14579 corp: 25/540b lim: 35 exec/s: 46 rss: 71Mb L: 23/34 MS: 1 CMP- DE: "\000\000\000\034"- 00:08:31.040 [2024-04-26 20:04:15.439041] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.439067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.439139] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.439153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.040 [2024-04-26 20:04:15.439208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.040 [2024-04-26 20:04:15.439222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.040 #47 NEW cov: 11903 ft: 14595 corp: 26/574b lim: 35 exec/s: 47 rss: 71Mb L: 34/34 MS: 1 InsertByte- 00:08:31.300 NEW_FUNC[1/2]: 0x4b6610 in feat_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:332 00:08:31.300 NEW_FUNC[2/2]: 0x1163d60 in nvmf_ctrlr_get_features_interrupt_vector_configuration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1650 00:08:31.300 #48 NEW cov: 11952 ft: 14664 corp: 27/583b lim: 35 exec/s: 48 rss: 72Mb L: 9/34 MS: 1 CopyPart- 
00:08:31.300 [2024-04-26 20:04:15.529177] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.529202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.300 [2024-04-26 20:04:15.529261] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.529274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.300 #49 NEW cov: 11952 ft: 14704 corp: 28/604b lim: 35 exec/s: 49 rss: 72Mb L: 21/34 MS: 1 CrossOver- 00:08:31.300 [2024-04-26 20:04:15.569417] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.569441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.300 [2024-04-26 20:04:15.569511] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.569525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.300 [2024-04-26 20:04:15.569578] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.569592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.300 #50 NEW cov: 11952 ft: 14715 corp: 29/637b lim: 35 exec/s: 50 rss: 72Mb L: 33/34 MS: 1 ShuffleBytes- 00:08:31.300 [2024-04-26 20:04:15.609233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.609257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.300 [2024-04-26 20:04:15.609329] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.609346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.300 #56 NEW cov: 11952 ft: 14729 corp: 30/653b lim: 35 exec/s: 56 rss: 72Mb L: 16/34 MS: 1 CopyPart- 00:08:31.300 [2024-04-26 20:04:15.649215] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.649239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.300 #57 NEW cov: 11952 ft: 14735 corp: 31/665b lim: 35 exec/s: 57 rss: 72Mb L: 12/34 MS: 1 CopyPart- 00:08:31.300 [2024-04-26 20:04:15.689317] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.689341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:08:31.300 #58 NEW cov: 11952 ft: 14752 corp: 32/674b lim: 35 exec/s: 58 rss: 72Mb L: 9/34 MS: 1 ShuffleBytes- 00:08:31.300 [2024-04-26 20:04:15.729556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.729580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.300 [2024-04-26 20:04:15.729652] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000006c1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.300 [2024-04-26 20:04:15.729666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.560 #59 NEW cov: 11952 ft: 14765 corp: 33/690b lim: 35 exec/s: 59 rss: 72Mb L: 16/34 MS: 1 CMP- DE: "\001\012\336\301\300\204\225\304"- 00:08:31.560 [2024-04-26 20:04:15.769992] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.770016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.560 [2024-04-26 20:04:15.770074] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.770087] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.560 [2024-04-26 20:04:15.770144] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.770157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.560 #60 NEW cov: 11952 ft: 14766 corp: 34/724b lim: 35 exec/s: 60 rss: 72Mb L: 34/34 MS: 1 ChangeByte- 00:08:31.560 [2024-04-26 20:04:15.810110] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.810136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.560 [2024-04-26 20:04:15.810192] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.810206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.560 [2024-04-26 20:04:15.810276] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.810290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.560 #61 NEW cov: 11952 ft: 14841 corp: 35/756b lim: 35 exec/s: 61 rss: 72Mb L: 32/34 MS: 1 EraseBytes- 00:08:31.560 [2024-04-26 20:04:15.850077] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.850106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.560 [2024-04-26 20:04:15.850163] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.850177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.560 [2024-04-26 20:04:15.850233] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.850246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.560 #62 NEW cov: 11952 ft: 14879 corp: 36/779b lim: 35 exec/s: 62 rss: 72Mb L: 23/34 MS: 1 CrossOver- 00:08:31.560 [2024-04-26 20:04:15.890004] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.890028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.560 [2024-04-26 20:04:15.890083] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.890097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.560 #63 NEW cov: 11952 ft: 14885 corp: 37/797b lim: 35 exec/s: 63 rss: 72Mb L: 18/34 MS: 1 CMP- DE: "\377\377\377\011"- 00:08:31.560 [2024-04-26 20:04:15.929982] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.930006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.560 #64 NEW cov: 11952 ft: 14905 corp: 38/806b lim: 35 exec/s: 64 rss: 72Mb L: 9/34 MS: 1 ShuffleBytes- 00:08:31.560 [2024-04-26 20:04:15.970090] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.560 [2024-04-26 20:04:15.970114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.560 #65 NEW cov: 11952 ft: 14911 corp: 39/815b lim: 35 exec/s: 65 rss: 72Mb L: 9/34 MS: 1 ChangeBit- 00:08:31.819 [2024-04-26 20:04:16.010350] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.010375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.819 [2024-04-26 20:04:16.010435] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.010450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.819 #66 NEW cov: 11952 ft: 14925 corp: 40/835b lim: 35 exec/s: 66 rss: 72Mb L: 20/34 MS: 1 EraseBytes- 00:08:31.819 [2024-04-26 20:04:16.050685] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 
cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.050711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.819 [2024-04-26 20:04:16.050785] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.050800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.819 [2024-04-26 20:04:16.050864] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000006da SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.050886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.819 [2024-04-26 20:04:16.050942] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000003da SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.050956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.819 #67 NEW cov: 11952 ft: 14978 corp: 41/864b lim: 35 exec/s: 67 rss: 72Mb L: 29/34 MS: 1 InsertRepeatedBytes- 00:08:31.819 [2024-04-26 20:04:16.090817] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.090843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.819 [2024-04-26 20:04:16.090918] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.090933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:31.819 [2024-04-26 20:04:16.090987] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.091000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:31.819 #68 NEW cov: 11952 ft: 15004 corp: 42/897b lim: 35 exec/s: 68 rss: 72Mb L: 33/34 MS: 1 ChangeByte- 00:08:31.819 [2024-04-26 20:04:16.130680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.130707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:31.819 [2024-04-26 20:04:16.130765] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000005a3 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:31.819 [2024-04-26 20:04:16.130779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:31.819 #69 NEW cov: 11952 ft: 15010 corp: 43/917b lim: 35 exec/s: 34 rss: 72Mb L: 20/34 MS: 1 PersAutoDict- DE: "\000\000\000\034"- 00:08:31.819 #69 DONE cov: 11952 ft: 15010 corp: 43/917b lim: 35 exec/s: 34 rss: 72Mb 00:08:31.819 ###### Recommended dictionary. 
###### 00:08:31.819 "\002\000" # Uses: 0 00:08:31.819 "\000\000\000\034" # Uses: 1 00:08:31.819 "\001\012\336\301\300\204\225\304" # Uses: 0 00:08:31.819 "\377\377\377\011" # Uses: 0 00:08:31.819 ###### End of recommended dictionary. ###### 00:08:31.819 Done 69 runs in 2 second(s) 00:08:32.083 20:04:16 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_15.conf /var/tmp/suppress_nvmf_fuzz 00:08:32.084 20:04:16 -- ../common.sh@72 -- # (( i++ )) 00:08:32.084 20:04:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:32.084 20:04:16 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1 00:08:32.084 20:04:16 -- nvmf/run.sh@23 -- # local fuzzer_type=16 00:08:32.084 20:04:16 -- nvmf/run.sh@24 -- # local timen=1 00:08:32.084 20:04:16 -- nvmf/run.sh@25 -- # local core=0x1 00:08:32.084 20:04:16 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.084 20:04:16 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf 00:08:32.084 20:04:16 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:32.084 20:04:16 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:32.084 20:04:16 -- nvmf/run.sh@34 -- # printf %02d 16 00:08:32.084 20:04:16 -- nvmf/run.sh@34 -- # port=4416 00:08:32.084 20:04:16 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.084 20:04:16 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' 00:08:32.084 20:04:16 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:32.084 20:04:16 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:32.084 20:04:16 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:32.084 20:04:16 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 00:08:32.084 [2024-04-26 20:04:16.334155] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:32.084 [2024-04-26 20:04:16.334240] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626734 ] 00:08:32.084 EAL: No free 2048 kB hugepages reported on node 1 00:08:32.404 [2024-04-26 20:04:16.531524] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.404 [2024-04-26 20:04:16.605303] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.404 [2024-04-26 20:04:16.664441] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:32.404 [2024-04-26 20:04:16.680647] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 *** 00:08:32.404 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:32.404 INFO: Seed: 175976397 00:08:32.404 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:32.404 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:32.404 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 00:08:32.404 INFO: A corpus is not provided, starting from an empty corpus 00:08:32.404 #2 INITED exec/s: 0 rss: 63Mb 00:08:32.404 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:32.404 This may also happen if the target rejected all inputs we tried so far 00:08:32.404 [2024-04-26 20:04:16.725981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.404 [2024-04-26 20:04:16.726011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.404 [2024-04-26 20:04:16.726063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.404 [2024-04-26 20:04:16.726079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.405 [2024-04-26 20:04:16.726132] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.405 [2024-04-26 20:04:16.726147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.664 NEW_FUNC[1/671]: 0x4987c0 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:32.664 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:32.664 #3 NEW cov: 11749 ft: 11750 corp: 2/82b lim: 105 exec/s: 0 rss: 69Mb L: 81/81 MS: 1 InsertRepeatedBytes- 00:08:32.664 [2024-04-26 20:04:17.056783] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.664 [2024-04-26 20:04:17.056822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.664 [2024-04-26 20:04:17.056876] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.664 [2024-04-26 20:04:17.056892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.664 [2024-04-26 20:04:17.056946] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.664 [2024-04-26 20:04:17.056965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.664 #9 NEW cov: 11879 ft: 12266 corp: 3/164b lim: 105 exec/s: 0 rss: 69Mb L: 82/82 MS: 1 InsertByte- 00:08:32.664 [2024-04-26 20:04:17.106636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627433664686891 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.664 [2024-04-26 20:04:17.106667] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.924 #24 NEW cov: 11885 ft: 12875 corp: 4/196b lim: 105 exec/s: 0 rss: 69Mb L: 32/82 MS: 5 ShuffleBytes-CopyPart-ChangeByte-InsertRepeatedBytes-InsertRepeatedBytes- 00:08:32.924 [2024-04-26 20:04:17.146787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.146814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.146852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.146867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.924 #25 NEW cov: 11970 ft: 13477 corp: 5/252b lim: 105 exec/s: 0 rss: 69Mb L: 56/82 MS: 1 CrossOver- 00:08:32.924 [2024-04-26 20:04:17.187147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744069733351423 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.187173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.187240] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.187255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.187306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.187321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.187372] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.187388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:32.924 #34 NEW cov: 11970 ft: 14047 corp: 6/342b lim: 105 exec/s: 0 rss: 69Mb L: 90/90 MS: 4 ChangeByte-ChangeBit-CopyPart-InsertRepeatedBytes- 00:08:32.924 [2024-04-26 20:04:17.227158] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.227185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.227230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.227245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.227297] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.227312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.924 #35 NEW cov: 11970 ft: 14149 corp: 7/424b lim: 105 exec/s: 0 rss: 69Mb L: 82/90 MS: 1 ChangeByte- 00:08:32.924 [2024-04-26 20:04:17.267066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.267092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.924 #36 NEW cov: 11970 ft: 14203 corp: 8/461b lim: 105 exec/s: 0 rss: 70Mb L: 37/90 MS: 1 EraseBytes- 00:08:32.924 [2024-04-26 20:04:17.317455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.317480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.317538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.317554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.317607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.317621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:32.924 #37 NEW cov: 11970 ft: 14239 corp: 9/533b lim: 105 exec/s: 0 rss: 70Mb L: 72/90 MS: 1 InsertRepeatedBytes- 00:08:32.924 [2024-04-26 20:04:17.357538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.357564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.357620] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23080948090273792 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.357636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:32.924 [2024-04-26 20:04:17.357689] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:32.924 [2024-04-26 20:04:17.357704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.184 #38 NEW cov: 11970 ft: 14323 corp: 10/615b lim: 105 exec/s: 0 rss: 70Mb L: 82/90 MS: 1 ChangeBinInt- 00:08:33.184 [2024-04-26 20:04:17.397424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.397451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE 
OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.184 #39 NEW cov: 11970 ft: 14378 corp: 11/652b lim: 105 exec/s: 0 rss: 70Mb L: 37/90 MS: 1 ChangeBit- 00:08:33.184 [2024-04-26 20:04:17.437740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.437767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.437813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.437828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.437883] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.437901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.184 #40 NEW cov: 11970 ft: 14393 corp: 12/725b lim: 105 exec/s: 0 rss: 70Mb L: 73/90 MS: 1 EraseBytes- 00:08:33.184 [2024-04-26 20:04:17.477886] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.477913] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.477959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.477976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.478027] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.478042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.184 #45 NEW cov: 11970 ft: 14463 corp: 13/808b lim: 105 exec/s: 0 rss: 70Mb L: 83/90 MS: 5 CopyPart-ShuffleBytes-ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:33.184 [2024-04-26 20:04:17.518073] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.518101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.518151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23080952385241088 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.518167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.518222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 
[2024-04-26 20:04:17.518238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.184 #46 NEW cov: 11970 ft: 14504 corp: 14/890b lim: 105 exec/s: 0 rss: 70Mb L: 82/90 MS: 1 ChangeBit- 00:08:33.184 [2024-04-26 20:04:17.568131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.568158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.568199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.568214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.568267] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:33 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.568298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.184 #47 NEW cov: 11970 ft: 14587 corp: 15/962b lim: 105 exec/s: 0 rss: 70Mb L: 72/90 MS: 1 ChangeBit- 00:08:33.184 [2024-04-26 20:04:17.608128] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.608158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.184 [2024-04-26 20:04:17.608227] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.184 [2024-04-26 20:04:17.608247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.444 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:33.444 #48 NEW cov: 11993 ft: 14627 corp: 16/1009b lim: 105 exec/s: 0 rss: 70Mb L: 47/90 MS: 1 CrossOver- 00:08:33.444 [2024-04-26 20:04:17.658160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.658186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.444 #49 NEW cov: 11993 ft: 14652 corp: 17/1039b lim: 105 exec/s: 0 rss: 70Mb L: 30/90 MS: 1 EraseBytes- 00:08:33.444 [2024-04-26 20:04:17.698499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.698525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.698577] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.698592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.698646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.698661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.444 #50 NEW cov: 11993 ft: 14685 corp: 18/1113b lim: 105 exec/s: 50 rss: 70Mb L: 74/90 MS: 1 InsertByte- 00:08:33.444 [2024-04-26 20:04:17.738716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.738742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.738810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23080948090273792 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.738825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.738881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446742974197924095 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.738896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.738950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2583691264 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.738965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.444 #51 NEW cov: 11993 ft: 14718 corp: 19/1199b lim: 105 exec/s: 51 rss: 70Mb L: 86/90 MS: 1 CrossOver- 00:08:33.444 [2024-04-26 20:04:17.778723] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:43 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.778749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.778811] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.778827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.778887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.778902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.444 #52 NEW cov: 11993 ft: 14778 corp: 20/1282b lim: 105 exec/s: 52 rss: 70Mb L: 83/90 MS: 1 ChangeByte- 00:08:33.444 [2024-04-26 20:04:17.818609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.818635] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.444 #53 NEW cov: 11993 ft: 14812 corp: 21/1312b lim: 105 exec/s: 53 rss: 71Mb L: 30/90 MS: 1 ShuffleBytes- 00:08:33.444 [2024-04-26 20:04:17.859103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.859129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.859199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.859214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.859265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.859280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.444 [2024-04-26 20:04:17.859331] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.444 [2024-04-26 20:04:17.859346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.444 #54 NEW cov: 11993 ft: 14851 corp: 22/1404b lim: 105 exec/s: 54 rss: 71Mb L: 92/92 MS: 1 CopyPart- 00:08:33.703 [2024-04-26 20:04:17.899174] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.899201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:17.899268] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.899283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:17.899334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.899349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:17.899401] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.899416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.703 #55 NEW cov: 11993 ft: 14866 corp: 23/1500b lim: 105 exec/s: 55 rss: 71Mb L: 96/96 MS: 1 CrossOver- 00:08:33.703 [2024-04-26 20:04:17.939201] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.939227] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:17.939289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.939309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:17.939360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.939375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.703 #56 NEW cov: 11993 ft: 14871 corp: 24/1572b lim: 105 exec/s: 56 rss: 71Mb L: 72/96 MS: 1 ShuffleBytes- 00:08:33.703 [2024-04-26 20:04:17.979295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:989855744 len:43 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.979321] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:17.979380] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.979396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:17.979450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:17.979465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.703 #57 NEW cov: 11993 ft: 14933 corp: 25/1655b lim: 105 exec/s: 57 rss: 71Mb L: 83/96 MS: 1 CMP- DE: "\007\000"- 00:08:33.703 [2024-04-26 20:04:18.019452] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175779584 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:18.019478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:18.019516] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:90159953477632 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:18.019531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:18.019583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:18.019598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.703 #58 NEW cov: 11993 ft: 14941 corp: 26/1738b lim: 105 exec/s: 58 rss: 71Mb L: 83/96 MS: 1 InsertByte- 00:08:33.703 [2024-04-26 20:04:18.059461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:18.059488] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.703 [2024-04-26 20:04:18.059547] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:18.059563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.703 #59 NEW cov: 11993 ft: 14950 corp: 27/1794b lim: 105 exec/s: 59 rss: 71Mb L: 56/96 MS: 1 PersAutoDict- DE: "\007\000"- 00:08:33.703 [2024-04-26 20:04:18.099432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.703 [2024-04-26 20:04:18.099458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.703 #60 NEW cov: 11993 ft: 14960 corp: 28/1831b lim: 105 exec/s: 60 rss: 71Mb L: 37/96 MS: 1 ShuffleBytes- 00:08:33.704 [2024-04-26 20:04:18.139698] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.704 [2024-04-26 20:04:18.139724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.704 [2024-04-26 20:04:18.139778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.704 [2024-04-26 20:04:18.139793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.963 #61 NEW cov: 11993 ft: 14966 corp: 29/1874b lim: 105 exec/s: 61 rss: 71Mb L: 43/96 MS: 1 EraseBytes- 00:08:33.963 [2024-04-26 20:04:18.179697] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.179724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.963 #62 NEW cov: 11993 ft: 15003 corp: 30/1896b lim: 105 exec/s: 62 rss: 72Mb L: 22/96 MS: 1 EraseBytes- 00:08:33.963 [2024-04-26 20:04:18.220165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.220191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.963 [2024-04-26 20:04:18.220245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23080948090273792 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.220259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.963 [2024-04-26 20:04:18.220310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:72057589742960640 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.220324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.963 [2024-04-26 20:04:18.220376] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:39424 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.220391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:33.963 #63 NEW cov: 11993 ft: 15013 corp: 31/1984b lim: 105 exec/s: 63 rss: 72Mb L: 88/96 MS: 1 PersAutoDict- DE: "\007\000"- 00:08:33.963 [2024-04-26 20:04:18.260138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.260164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.963 [2024-04-26 20:04:18.260210] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.260225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.963 [2024-04-26 20:04:18.260276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.260291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.963 #64 NEW cov: 11993 ft: 15021 corp: 32/2056b lim: 105 exec/s: 64 rss: 72Mb L: 72/96 MS: 1 ChangeBit- 00:08:33.963 [2024-04-26 20:04:18.300232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.300259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.963 [2024-04-26 20:04:18.300316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.300331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.963 [2024-04-26 20:04:18.300385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.300399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.963 #65 NEW cov: 11993 ft: 15065 corp: 33/2129b lim: 105 exec/s: 65 rss: 72Mb L: 73/96 MS: 1 InsertByte- 00:08:33.963 [2024-04-26 20:04:18.340393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.340419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.963 [2024-04-26 20:04:18.340465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.340481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:33.963 
[2024-04-26 20:04:18.340533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.340547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:33.963 #66 NEW cov: 11993 ft: 15075 corp: 34/2204b lim: 105 exec/s: 66 rss: 72Mb L: 75/96 MS: 1 InsertByte- 00:08:33.963 [2024-04-26 20:04:18.380258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:33.963 [2024-04-26 20:04:18.380285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:33.963 #67 NEW cov: 11993 ft: 15093 corp: 35/2241b lim: 105 exec/s: 67 rss: 72Mb L: 37/96 MS: 1 CrossOver- 00:08:34.223 [2024-04-26 20:04:18.420741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.420767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.420834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.420849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.420904] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.420919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.420984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.421000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.223 #68 NEW cov: 11993 ft: 15197 corp: 36/2345b lim: 105 exec/s: 68 rss: 72Mb L: 104/104 MS: 1 InsertRepeatedBytes- 00:08:34.223 [2024-04-26 20:04:18.470751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175779584 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.470778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.470832] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:90159953477632 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.470847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.470905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.470921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:34.223 #69 NEW cov: 11993 ft: 15204 corp: 37/2428b lim: 105 exec/s: 69 rss: 72Mb L: 83/104 MS: 1 CopyPart- 00:08:34.223 [2024-04-26 20:04:18.510839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.510867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.510922] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.510937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.510989] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:601295421440 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.511004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.223 #70 NEW cov: 11993 ft: 15284 corp: 38/2498b lim: 105 exec/s: 70 rss: 72Mb L: 70/104 MS: 1 CopyPart- 00:08:34.223 [2024-04-26 20:04:18.551085] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.551111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.551165] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.551179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.551229] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.223 [2024-04-26 20:04:18.551244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.223 [2024-04-26 20:04:18.551295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.224 [2024-04-26 20:04:18.551310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.224 #71 NEW cov: 11993 ft: 15359 corp: 39/2598b lim: 105 exec/s: 71 rss: 72Mb L: 100/104 MS: 1 InsertRepeatedBytes- 00:08:34.224 [2024-04-26 20:04:18.591091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.224 [2024-04-26 20:04:18.591117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.224 [2024-04-26 20:04:18.591180] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.224 [2024-04-26 20:04:18.591195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:34.224 [2024-04-26 20:04:18.591249] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:10092544 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.224 [2024-04-26 20:04:18.591267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.224 #72 NEW cov: 11993 ft: 15360 corp: 40/2669b lim: 105 exec/s: 72 rss: 72Mb L: 71/104 MS: 1 CrossOver- 00:08:34.224 [2024-04-26 20:04:18.631086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.224 [2024-04-26 20:04:18.631112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.224 [2024-04-26 20:04:18.631162] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23080948090273792 len:11 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.224 [2024-04-26 20:04:18.631178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.224 #73 NEW cov: 11993 ft: 15362 corp: 41/2729b lim: 105 exec/s: 73 rss: 72Mb L: 60/104 MS: 1 CrossOver- 00:08:34.483 [2024-04-26 20:04:18.671450] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:175767552 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.483 [2024-04-26 20:04:18.671477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.483 [2024-04-26 20:04:18.671540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:23084246625157120 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.483 [2024-04-26 20:04:18.671555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:34.483 [2024-04-26 20:04:18.671606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:7696581394432 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.483 [2024-04-26 20:04:18.671620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:34.483 [2024-04-26 20:04:18.671673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:39425 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.484 [2024-04-26 20:04:18.671687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:34.484 #74 NEW cov: 11993 ft: 15369 corp: 42/2825b lim: 105 exec/s: 74 rss: 72Mb L: 96/104 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:34.484 [2024-04-26 20:04:18.721235] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:3110627433664686891 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:34.484 [2024-04-26 20:04:18.721262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:34.484 #75 NEW cov: 11993 ft: 15406 corp: 43/2852b lim: 105 exec/s: 37 rss: 73Mb L: 27/104 MS: 1 EraseBytes- 00:08:34.484 #75 DONE cov: 11993 ft: 15406 corp: 43/2852b lim: 105 exec/s: 37 rss: 73Mb 00:08:34.484 ###### Recommended dictionary. 
###### 00:08:34.484 "\007\000" # Uses: 2 00:08:34.484 "\003\000\000\000\000\000\000\000" # Uses: 0 00:08:34.484 ###### End of recommended dictionary. ###### 00:08:34.484 Done 75 runs in 2 second(s) 00:08:34.484 20:04:18 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_16.conf /var/tmp/suppress_nvmf_fuzz 00:08:34.484 20:04:18 -- ../common.sh@72 -- # (( i++ )) 00:08:34.484 20:04:18 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:34.484 20:04:18 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:34.484 20:04:18 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:34.484 20:04:18 -- nvmf/run.sh@24 -- # local timen=1 00:08:34.484 20:04:18 -- nvmf/run.sh@25 -- # local core=0x1 00:08:34.484 20:04:18 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:34.484 20:04:18 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:34.484 20:04:18 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:34.484 20:04:18 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:34.484 20:04:18 -- nvmf/run.sh@34 -- # printf %02d 17 00:08:34.484 20:04:18 -- nvmf/run.sh@34 -- # port=4417 00:08:34.484 20:04:18 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:34.484 20:04:18 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:34.484 20:04:18 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:34.484 20:04:18 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:34.484 20:04:18 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:34.484 20:04:18 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 00:08:34.743 [2024-04-26 20:04:18.929131] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:34.743 [2024-04-26 20:04:18.929219] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1627055 ] 00:08:34.743 EAL: No free 2048 kB hugepages reported on node 1 00:08:34.743 [2024-04-26 20:04:19.123247] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:35.002 [2024-04-26 20:04:19.195417] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.002 [2024-04-26 20:04:19.254628] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:35.002 [2024-04-26 20:04:19.270835] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:35.002 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:35.002 INFO: Seed: 2766967330 00:08:35.002 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:35.002 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:35.002 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:35.002 INFO: A corpus is not provided, starting from an empty corpus 00:08:35.002 #2 INITED exec/s: 0 rss: 63Mb 00:08:35.002 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:35.002 This may also happen if the target rejected all inputs we tried so far 00:08:35.002 [2024-04-26 20:04:19.316044] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.002 [2024-04-26 20:04:19.316075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.261 NEW_FUNC[1/672]: 0x49bb40 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:35.261 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:35.261 #20 NEW cov: 11770 ft: 11771 corp: 2/37b lim: 120 exec/s: 0 rss: 69Mb L: 36/36 MS: 3 CrossOver-CopyPart-InsertRepeatedBytes- 00:08:35.261 [2024-04-26 20:04:19.658272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:17651002168384091380 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.261 [2024-04-26 20:04:19.658320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.261 [2024-04-26 20:04:19.658410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.261 [2024-04-26 20:04:19.658434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.261 #30 NEW cov: 11900 ft: 13092 corp: 3/90b lim: 120 exec/s: 0 rss: 69Mb L: 53/53 MS: 5 ChangeByte-CrossOver-CopyPart-InsertByte-InsertRepeatedBytes- 00:08:35.520 [2024-04-26 20:04:19.708226] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.708255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.520 #36 NEW cov: 11906 ft: 13354 corp: 4/126b lim: 120 exec/s: 0 rss: 69Mb L: 36/53 MS: 1 CopyPart- 00:08:35.520 [2024-04-26 20:04:19.769171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1085102592487264015 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.769198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.520 [2024-04-26 20:04:19.769265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.769285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:35.520 [2024-04-26 20:04:19.769360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.769378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.520 [2024-04-26 20:04:19.769457] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.769475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:35.520 #37 NEW cov: 11991 ft: 14008 corp: 5/244b lim: 120 exec/s: 0 rss: 69Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:08:35.520 [2024-04-26 20:04:19.819056] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.819082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.520 [2024-04-26 20:04:19.819146] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.819162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.520 [2024-04-26 20:04:19.819242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.819261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.520 #38 NEW cov: 11991 ft: 14452 corp: 6/319b lim: 120 exec/s: 0 rss: 69Mb L: 75/118 MS: 1 InsertRepeatedBytes- 00:08:35.520 [2024-04-26 20:04:19.869635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1085102592487264015 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.869661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.520 [2024-04-26 20:04:19.869743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.869764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.520 [2024-04-26 20:04:19.869840] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.869862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.520 [2024-04-26 20:04:19.869944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.869963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 
00:08:35.520 #39 NEW cov: 11991 ft: 14511 corp: 7/437b lim: 120 exec/s: 0 rss: 69Mb L: 118/118 MS: 1 ShuffleBytes- 00:08:35.520 [2024-04-26 20:04:19.928788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.520 [2024-04-26 20:04:19.928814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.520 #45 NEW cov: 11991 ft: 14551 corp: 8/473b lim: 120 exec/s: 0 rss: 69Mb L: 36/118 MS: 1 CrossOver- 00:08:35.780 [2024-04-26 20:04:19.989647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.780 [2024-04-26 20:04:19.989674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.780 [2024-04-26 20:04:19.989740] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.780 [2024-04-26 20:04:19.989759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:35.780 [2024-04-26 20:04:19.989831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:24576 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.780 [2024-04-26 20:04:19.989850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:35.780 #46 NEW cov: 11991 ft: 14608 corp: 9/549b lim: 120 exec/s: 0 rss: 69Mb L: 76/118 MS: 1 InsertByte- 00:08:35.780 [2024-04-26 20:04:20.048635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.780 [2024-04-26 20:04:20.048741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.780 #47 NEW cov: 11991 ft: 14740 corp: 10/585b lim: 120 exec/s: 0 rss: 69Mb L: 36/118 MS: 1 ChangeBinInt- 00:08:35.780 [2024-04-26 20:04:20.109465] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.780 [2024-04-26 20:04:20.109498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.780 #49 NEW cov: 11991 ft: 14773 corp: 11/620b lim: 120 exec/s: 0 rss: 69Mb L: 35/118 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:35.780 [2024-04-26 20:04:20.159757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.780 [2024-04-26 20:04:20.159791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:35.780 #50 NEW cov: 11991 ft: 14817 corp: 12/653b lim: 120 exec/s: 0 rss: 70Mb L: 33/118 MS: 1 EraseBytes- 00:08:35.780 [2024-04-26 20:04:20.220280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:17651002168384091380 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.780 [2024-04-26 20:04:20.220313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:35.780 [2024-04-26 20:04:20.220354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:35.780 [2024-04-26 20:04:20.220372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.046 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:36.046 #51 NEW cov: 12014 ft: 14881 corp: 13/706b lim: 120 exec/s: 0 rss: 70Mb L: 53/118 MS: 1 ChangeByte- 00:08:36.046 [2024-04-26 20:04:20.280098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:168427520 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.280131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.046 #52 NEW cov: 12014 ft: 14912 corp: 14/742b lim: 120 exec/s: 0 rss: 70Mb L: 36/118 MS: 1 ShuffleBytes- 00:08:36.046 [2024-04-26 20:04:20.330893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.330934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.046 [2024-04-26 20:04:20.331001] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.331031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.046 [2024-04-26 20:04:20.331106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.331123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.046 #53 NEW cov: 12014 ft: 14928 corp: 15/817b lim: 120 exec/s: 53 rss: 70Mb L: 75/118 MS: 1 ChangeBinInt- 00:08:36.046 [2024-04-26 20:04:20.381100] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.381129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.046 [2024-04-26 20:04:20.381199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.381216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.046 [2024-04-26 20:04:20.381291] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:24576 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.381308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.046 #54 NEW cov: 12014 ft: 15050 corp: 16/893b lim: 120 exec/s: 54 rss: 70Mb L: 76/118 MS: 1 ShuffleBytes- 00:08:36.046 [2024-04-26 20:04:20.441089] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 
cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.441114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.046 [2024-04-26 20:04:20.441186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.441205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.046 [2024-04-26 20:04:20.441281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.046 [2024-04-26 20:04:20.441298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.046 #55 NEW cov: 12014 ft: 15125 corp: 17/970b lim: 120 exec/s: 55 rss: 70Mb L: 77/118 MS: 1 CMP- DE: "\377\007"- 00:08:36.306 [2024-04-26 20:04:20.492155] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:1085102592487264015 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.492189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.492277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.492293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.492376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.492391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.492479] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.492496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.492576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:1085102592571150095 len:3856 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.492594] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.306 #56 NEW cov: 12014 ft: 15219 corp: 18/1090b lim: 120 exec/s: 56 rss: 70Mb L: 120/120 MS: 1 CopyPart- 00:08:36.306 [2024-04-26 20:04:20.541857] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.541889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.541981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:36.306 [2024-04-26 20:04:20.542002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.542066] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.542083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.542168] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.542188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.306 #57 NEW cov: 12014 ft: 15268 corp: 19/1197b lim: 120 exec/s: 57 rss: 70Mb L: 107/120 MS: 1 CrossOver- 00:08:36.306 [2024-04-26 20:04:20.591420] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:17651002168384091380 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.591447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.591511] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.591529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.306 #58 NEW cov: 12014 ft: 15284 corp: 20/1250b lim: 120 exec/s: 58 rss: 70Mb L: 53/120 MS: 1 ChangeByte- 00:08:36.306 [2024-04-26 20:04:20.651197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:13053329408 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.651224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.306 #59 NEW cov: 12014 ft: 15336 corp: 21/1286b lim: 120 exec/s: 59 rss: 70Mb L: 36/120 MS: 1 ChangeBinInt- 00:08:36.306 [2024-04-26 20:04:20.702367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:17651002168384091380 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.702393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.702467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.702485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.702531] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.702547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.306 [2024-04-26 20:04:20.702631] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:17627639749423723764 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.306 [2024-04-26 20:04:20.702648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.306 #60 NEW cov: 12014 ft: 15356 corp: 22/1393b lim: 120 exec/s: 60 rss: 70Mb L: 107/120 MS: 1 InsertRepeatedBytes- 00:08:36.566 [2024-04-26 20:04:20.762317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.762344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.762411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.762428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.762517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:96 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.762533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.566 #61 NEW cov: 12014 ft: 15373 corp: 23/1470b lim: 120 exec/s: 61 rss: 70Mb L: 77/120 MS: 1 InsertByte- 00:08:36.566 [2024-04-26 20:04:20.823145] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.823172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.823299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.823316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.823403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.823420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.823501] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.823518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.823602] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:4 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.823622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:36.566 #62 NEW cov: 12014 ft: 15383 corp: 24/1590b lim: 120 exec/s: 62 rss: 70Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:08:36.566 
[2024-04-26 20:04:20.882701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.882726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.882789] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.882807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.882893] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.882923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.566 #63 NEW cov: 12014 ft: 15421 corp: 25/1665b lim: 120 exec/s: 63 rss: 70Mb L: 75/120 MS: 1 CrossOver- 00:08:36.566 [2024-04-26 20:04:20.942982] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.943009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.943080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.943098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.943175] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.943191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.566 #64 NEW cov: 12014 ft: 15453 corp: 26/1740b lim: 120 exec/s: 64 rss: 70Mb L: 75/120 MS: 1 ChangeByte- 00:08:36.566 [2024-04-26 20:04:20.992841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:17651002168384091380 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.992868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.566 [2024-04-26 20:04:20.992954] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.566 [2024-04-26 20:04:20.992975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.825 #65 NEW cov: 12014 ft: 15461 corp: 27/1793b lim: 120 exec/s: 65 rss: 70Mb L: 53/120 MS: 1 ChangeByte- 00:08:36.825 [2024-04-26 20:04:21.043232] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.825 [2024-04-26 20:04:21.043259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.825 
[2024-04-26 20:04:21.043344] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.825 [2024-04-26 20:04:21.043362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.825 [2024-04-26 20:04:21.043444] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:24576 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.825 [2024-04-26 20:04:21.043463] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.825 #66 NEW cov: 12014 ft: 15471 corp: 28/1869b lim: 120 exec/s: 66 rss: 70Mb L: 76/120 MS: 1 ChangeByte- 00:08:36.825 [2024-04-26 20:04:21.093429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.825 [2024-04-26 20:04:21.093455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.825 [2024-04-26 20:04:21.093519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.825 [2024-04-26 20:04:21.093538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.825 [2024-04-26 20:04:21.093613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.825 [2024-04-26 20:04:21.093628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.825 #67 NEW cov: 12014 ft: 15487 corp: 29/1944b lim: 120 exec/s: 67 rss: 70Mb L: 75/120 MS: 1 PersAutoDict- DE: "\377\007"- 00:08:36.825 [2024-04-26 20:04:21.143662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.825 [2024-04-26 20:04:21.143689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.825 [2024-04-26 20:04:21.143759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.825 [2024-04-26 20:04:21.143780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.825 [2024-04-26 20:04:21.143845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:6917529027641081856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.826 [2024-04-26 20:04:21.143864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.826 #68 NEW cov: 12014 ft: 15496 corp: 30/2022b lim: 120 exec/s: 68 rss: 70Mb L: 78/120 MS: 1 InsertByte- 00:08:36.826 [2024-04-26 20:04:21.193940] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.826 [2024-04-26 20:04:21.193968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.826 
[2024-04-26 20:04:21.194040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:279275953455104 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.826 [2024-04-26 20:04:21.194056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.826 [2024-04-26 20:04:21.194136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:24576 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.826 [2024-04-26 20:04:21.194156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.826 #69 NEW cov: 12014 ft: 15517 corp: 31/2098b lim: 120 exec/s: 69 rss: 70Mb L: 76/120 MS: 1 ChangeBinInt- 00:08:36.826 [2024-04-26 20:04:21.243943] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.826 [2024-04-26 20:04:21.243970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:36.826 [2024-04-26 20:04:21.244041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:71783815642611712 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.826 [2024-04-26 20:04:21.244064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:36.826 [2024-04-26 20:04:21.244120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:6917529027641081856 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:36.826 [2024-04-26 20:04:21.244135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:36.826 #70 NEW cov: 12014 ft: 15522 corp: 32/2176b lim: 120 exec/s: 70 rss: 70Mb L: 78/120 MS: 1 PersAutoDict- DE: "\377\007"- 00:08:37.085 [2024-04-26 20:04:21.293801] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:17651002168384091380 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.085 [2024-04-26 20:04:21.293827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.085 [2024-04-26 20:04:21.293899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:17651002172490708212 len:62709 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:37.085 [2024-04-26 20:04:21.293929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.085 #71 NEW cov: 12014 ft: 15532 corp: 33/2229b lim: 120 exec/s: 35 rss: 70Mb L: 53/120 MS: 1 PersAutoDict- DE: "\377\007"- 00:08:37.085 #71 DONE cov: 12014 ft: 15532 corp: 33/2229b lim: 120 exec/s: 35 rss: 70Mb 00:08:37.085 ###### Recommended dictionary. ###### 00:08:37.085 "\377\007" # Uses: 3 00:08:37.085 ###### End of recommended dictionary. 
###### 00:08:37.085 Done 71 runs in 2 second(s) 00:08:37.085 20:04:21 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_17.conf /var/tmp/suppress_nvmf_fuzz 00:08:37.085 20:04:21 -- ../common.sh@72 -- # (( i++ )) 00:08:37.085 20:04:21 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:37.085 20:04:21 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:37.085 20:04:21 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:37.085 20:04:21 -- nvmf/run.sh@24 -- # local timen=1 00:08:37.085 20:04:21 -- nvmf/run.sh@25 -- # local core=0x1 00:08:37.085 20:04:21 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:37.085 20:04:21 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:37.085 20:04:21 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:37.085 20:04:21 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:37.085 20:04:21 -- nvmf/run.sh@34 -- # printf %02d 18 00:08:37.085 20:04:21 -- nvmf/run.sh@34 -- # port=4418 00:08:37.085 20:04:21 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:37.085 20:04:21 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:37.085 20:04:21 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:37.085 20:04:21 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:37.085 20:04:21 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:37.085 20:04:21 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 00:08:37.085 [2024-04-26 20:04:21.486224] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:37.085 [2024-04-26 20:04:21.486299] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1627352 ] 00:08:37.085 EAL: No free 2048 kB hugepages reported on node 1 00:08:37.345 [2024-04-26 20:04:21.676033] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:37.345 [2024-04-26 20:04:21.748651] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:37.604 [2024-04-26 20:04:21.807808] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:37.604 [2024-04-26 20:04:21.824067] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:37.604 INFO: Running with entropic power schedule (0xFF, 100). 
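The two "echo leak:" lines in the trace above (nvmf/run.sh@41 and @42) populate the LeakSanitizer suppression file that LSAN_OPTIONS points at, so the known qpair/controller allocations held by the target do not fail the leak check when the fuzzer exits. A rough equivalent of that setup, as a sketch — the file name and option string are copied from the trace, the write to the file is assumed since redirections are not shown in the xtrace:

    # Leak-suppression setup used for each llvm_nvme_fuzz run.
    suppress_file=/var/tmp/suppress_nvmf_fuzz
    cat > "$suppress_file" <<'EOF'
    leak:spdk_nvmf_qpair_disconnect
    leak:nvmf_ctrlr_create
    EOF
    # report_objects=1 lists leaked objects, print_suppressions=0 keeps the
    # summary quiet; suppressed call stacks are ignored by the leak check.
    export LSAN_OPTIONS="report_objects=1:suppressions=$suppress_file:print_suppressions=0"
    # The fuzzer binary (run.sh@45 above) is then launched with this environment.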
00:08:37.604 INFO: Seed: 1024996510 00:08:37.604 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:37.604 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:37.604 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:37.604 INFO: A corpus is not provided, starting from an empty corpus 00:08:37.604 #2 INITED exec/s: 0 rss: 63Mb 00:08:37.604 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:37.604 This may also happen if the target rejected all inputs we tried so far 00:08:37.604 [2024-04-26 20:04:21.868847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.604 [2024-04-26 20:04:21.868885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.604 [2024-04-26 20:04:21.868921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:37.604 [2024-04-26 20:04:21.868937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.604 [2024-04-26 20:04:21.868968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:37.604 [2024-04-26 20:04:21.868982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.604 [2024-04-26 20:04:21.869011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:37.604 [2024-04-26 20:04:21.869025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.864 NEW_FUNC[1/670]: 0x49f430 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:37.864 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:37.864 #5 NEW cov: 11713 ft: 11714 corp: 2/92b lim: 100 exec/s: 0 rss: 69Mb L: 91/91 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:08:37.864 [2024-04-26 20:04:22.209643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.864 [2024-04-26 20:04:22.209682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.864 [2024-04-26 20:04:22.209732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:37.864 [2024-04-26 20:04:22.209748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.864 [2024-04-26 20:04:22.209776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:37.864 [2024-04-26 20:04:22.209791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.864 [2024-04-26 20:04:22.209819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:37.864 [2024-04-26 20:04:22.209833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:37.864 #11 NEW cov: 11843 ft: 12192 corp: 3/184b lim: 100 exec/s: 0 rss: 69Mb L: 92/92 MS: 1 InsertByte- 00:08:37.864 [2024-04-26 20:04:22.279696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:37.864 [2024-04-26 20:04:22.279727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:37.864 [2024-04-26 20:04:22.279774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:37.864 [2024-04-26 20:04:22.279795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:37.864 [2024-04-26 20:04:22.279826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:37.864 [2024-04-26 20:04:22.279841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:37.864 [2024-04-26 20:04:22.279869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:37.864 [2024-04-26 20:04:22.279892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.124 #12 NEW cov: 11849 ft: 12489 corp: 4/266b lim: 100 exec/s: 0 rss: 69Mb L: 82/92 MS: 1 CrossOver- 00:08:38.124 [2024-04-26 20:04:22.329827] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.124 [2024-04-26 20:04:22.329855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.124 [2024-04-26 20:04:22.329907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.124 [2024-04-26 20:04:22.329924] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.124 [2024-04-26 20:04:22.329954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.124 [2024-04-26 20:04:22.329969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.124 [2024-04-26 20:04:22.329997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.124 [2024-04-26 20:04:22.330012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.124 #13 NEW cov: 11934 ft: 12723 corp: 5/357b lim: 100 exec/s: 0 rss: 69Mb L: 91/92 MS: 1 ChangeByte- 00:08:38.124 [2024-04-26 20:04:22.379962] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.124 [2024-04-26 20:04:22.379989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.124 [2024-04-26 20:04:22.380036] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.125 [2024-04-26 20:04:22.380053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 
20:04:22.380082] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.125 [2024-04-26 20:04:22.380097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.380125] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.125 [2024-04-26 20:04:22.380139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.125 #14 NEW cov: 11934 ft: 12870 corp: 6/455b lim: 100 exec/s: 0 rss: 69Mb L: 98/98 MS: 1 InsertRepeatedBytes- 00:08:38.125 [2024-04-26 20:04:22.450127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.125 [2024-04-26 20:04:22.450154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.450200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.125 [2024-04-26 20:04:22.450217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.450246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.125 [2024-04-26 20:04:22.450260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.450293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.125 [2024-04-26 20:04:22.450308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.125 #15 NEW cov: 11934 ft: 12985 corp: 7/554b lim: 100 exec/s: 0 rss: 69Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:38.125 [2024-04-26 20:04:22.500281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.125 [2024-04-26 20:04:22.500308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.500354] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.125 [2024-04-26 20:04:22.500370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.500400] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.125 [2024-04-26 20:04:22.500414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.500443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.125 [2024-04-26 20:04:22.500458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.125 #18 NEW cov: 11934 ft: 13067 corp: 8/653b lim: 100 exec/s: 0 rss: 70Mb L: 99/99 MS: 3 ChangeBit-CMP-InsertRepeatedBytes- DE: "\271\030Z_\305\336\012\000"- 00:08:38.125 [2024-04-26 
20:04:22.550379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.125 [2024-04-26 20:04:22.550406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.550452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.125 [2024-04-26 20:04:22.550469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.550498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.125 [2024-04-26 20:04:22.550513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.125 [2024-04-26 20:04:22.550540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.125 [2024-04-26 20:04:22.550554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.384 #19 NEW cov: 11934 ft: 13110 corp: 9/752b lim: 100 exec/s: 0 rss: 70Mb L: 99/99 MS: 1 ChangeBinInt- 00:08:38.384 [2024-04-26 20:04:22.620518] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.384 [2024-04-26 20:04:22.620546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.620594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.384 [2024-04-26 20:04:22.620609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.384 #20 NEW cov: 11934 ft: 13562 corp: 10/809b lim: 100 exec/s: 0 rss: 70Mb L: 57/99 MS: 1 CrossOver- 00:08:38.384 [2024-04-26 20:04:22.690867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.384 [2024-04-26 20:04:22.690901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.690947] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.384 [2024-04-26 20:04:22.690967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.690997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.384 [2024-04-26 20:04:22.691012] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.691039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.384 [2024-04-26 20:04:22.691054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.691081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:38.384 [2024-04-26 20:04:22.691095] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.384 #21 NEW cov: 11934 ft: 13689 corp: 11/909b lim: 100 exec/s: 0 rss: 70Mb L: 100/100 MS: 1 CrossOver- 00:08:38.384 [2024-04-26 20:04:22.751044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.384 [2024-04-26 20:04:22.751073] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.751120] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.384 [2024-04-26 20:04:22.751138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.751168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.384 [2024-04-26 20:04:22.751184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.751212] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.384 [2024-04-26 20:04:22.751227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.751255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:38.384 [2024-04-26 20:04:22.751270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.384 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:38.384 #22 NEW cov: 11951 ft: 13780 corp: 12/1009b lim: 100 exec/s: 0 rss: 70Mb L: 100/100 MS: 1 InsertByte- 00:08:38.384 [2024-04-26 20:04:22.821131] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.384 [2024-04-26 20:04:22.821159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.384 [2024-04-26 20:04:22.821204] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.384 [2024-04-26 20:04:22.821221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.385 [2024-04-26 20:04:22.821251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.385 [2024-04-26 20:04:22.821265] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.385 [2024-04-26 20:04:22.821293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.385 [2024-04-26 20:04:22.821307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.644 #23 NEW cov: 11951 ft: 13811 corp: 13/1108b lim: 100 exec/s: 23 rss: 70Mb L: 99/100 MS: 1 CrossOver- 00:08:38.644 [2024-04-26 20:04:22.871273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 
00:08:38.644 [2024-04-26 20:04:22.871300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:22.871347] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.644 [2024-04-26 20:04:22.871363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:22.871393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.644 [2024-04-26 20:04:22.871408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:22.871435] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.644 [2024-04-26 20:04:22.871449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.644 #24 NEW cov: 11951 ft: 13829 corp: 14/1190b lim: 100 exec/s: 24 rss: 70Mb L: 82/100 MS: 1 EraseBytes- 00:08:38.644 [2024-04-26 20:04:22.941369] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.644 [2024-04-26 20:04:22.941397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:22.941430] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.644 [2024-04-26 20:04:22.941447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.644 #25 NEW cov: 11951 ft: 13877 corp: 15/1247b lim: 100 exec/s: 25 rss: 70Mb L: 57/100 MS: 1 ChangeByte- 00:08:38.644 [2024-04-26 20:04:23.011602] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.644 [2024-04-26 20:04:23.011629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:23.011675] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.644 [2024-04-26 20:04:23.011692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:23.011721] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.644 [2024-04-26 20:04:23.011736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:23.011764] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.644 [2024-04-26 20:04:23.011778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.644 #26 NEW cov: 11951 ft: 13915 corp: 16/1345b lim: 100 exec/s: 26 rss: 70Mb L: 98/100 MS: 1 ChangeBit- 00:08:38.644 [2024-04-26 20:04:23.071793] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.644 [2024-04-26 20:04:23.071820] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:23.071866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.644 [2024-04-26 20:04:23.071889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:23.071918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.644 [2024-04-26 20:04:23.071933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.644 [2024-04-26 20:04:23.071965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.644 [2024-04-26 20:04:23.071980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.904 #27 NEW cov: 11951 ft: 13936 corp: 17/1444b lim: 100 exec/s: 27 rss: 70Mb L: 99/100 MS: 1 CMP- DE: "\201\000\000\000\000\000\000\000"- 00:08:38.904 [2024-04-26 20:04:23.121940] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.904 [2024-04-26 20:04:23.121967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.122013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.904 [2024-04-26 20:04:23.122029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.122060] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.904 [2024-04-26 20:04:23.122074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.122102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.904 [2024-04-26 20:04:23.122116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.122144] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:38.904 [2024-04-26 20:04:23.122158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.904 #28 NEW cov: 11951 ft: 13946 corp: 18/1544b lim: 100 exec/s: 28 rss: 70Mb L: 100/100 MS: 1 InsertByte- 00:08:38.904 [2024-04-26 20:04:23.172100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.904 [2024-04-26 20:04:23.172126] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.172172] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.904 [2024-04-26 20:04:23.172189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 
20:04:23.172218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.904 [2024-04-26 20:04:23.172233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.172261] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.904 [2024-04-26 20:04:23.172275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.172303] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:38.904 [2024-04-26 20:04:23.172317] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:38.904 #34 NEW cov: 11951 ft: 13993 corp: 19/1644b lim: 100 exec/s: 34 rss: 71Mb L: 100/100 MS: 1 InsertByte- 00:08:38.904 [2024-04-26 20:04:23.222163] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.904 [2024-04-26 20:04:23.222190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.222235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.904 [2024-04-26 20:04:23.222252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.222285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:38.904 [2024-04-26 20:04:23.222300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.222328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:38.904 [2024-04-26 20:04:23.222342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:38.904 #35 NEW cov: 11951 ft: 14008 corp: 20/1743b lim: 100 exec/s: 35 rss: 71Mb L: 99/100 MS: 1 ChangeBit- 00:08:38.904 [2024-04-26 20:04:23.282265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:38.904 [2024-04-26 20:04:23.282292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:38.904 [2024-04-26 20:04:23.282340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:38.904 [2024-04-26 20:04:23.282356] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:38.904 #36 NEW cov: 11951 ft: 14021 corp: 21/1801b lim: 100 exec/s: 36 rss: 71Mb L: 58/100 MS: 1 InsertByte- 00:08:39.163 [2024-04-26 20:04:23.352440] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.163 [2024-04-26 20:04:23.352469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.352502] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.163 [2024-04-26 20:04:23.352518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.163 #37 NEW cov: 11951 ft: 14032 corp: 22/1849b lim: 100 exec/s: 37 rss: 71Mb L: 48/100 MS: 1 EraseBytes- 00:08:39.163 [2024-04-26 20:04:23.402652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.163 [2024-04-26 20:04:23.402679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.402724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.163 [2024-04-26 20:04:23.402741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.402771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.163 [2024-04-26 20:04:23.402785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.402813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.163 [2024-04-26 20:04:23.402828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.163 #38 NEW cov: 11951 ft: 14052 corp: 23/1941b lim: 100 exec/s: 38 rss: 71Mb L: 92/100 MS: 1 ChangeBinInt- 00:08:39.163 [2024-04-26 20:04:23.473028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.163 [2024-04-26 20:04:23.473056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.473088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.163 [2024-04-26 20:04:23.473105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.473135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.163 [2024-04-26 20:04:23.473150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.473183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.163 [2024-04-26 20:04:23.473198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.163 #39 NEW cov: 11951 ft: 14082 corp: 24/2024b lim: 100 exec/s: 39 rss: 71Mb L: 83/100 MS: 1 InsertByte- 00:08:39.163 [2024-04-26 20:04:23.533114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.163 [2024-04-26 20:04:23.533140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.533186] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.163 
[2024-04-26 20:04:23.533202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.533232] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.163 [2024-04-26 20:04:23.533246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.533274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.163 [2024-04-26 20:04:23.533289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.163 #40 NEW cov: 11951 ft: 14094 corp: 25/2116b lim: 100 exec/s: 40 rss: 72Mb L: 92/100 MS: 1 CrossOver- 00:08:39.163 [2024-04-26 20:04:23.603350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.163 [2024-04-26 20:04:23.603379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.603427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.163 [2024-04-26 20:04:23.603445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.603475] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.163 [2024-04-26 20:04:23.603491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.163 [2024-04-26 20:04:23.603520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.164 [2024-04-26 20:04:23.603537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.423 #41 NEW cov: 11951 ft: 14103 corp: 26/2215b lim: 100 exec/s: 41 rss: 72Mb L: 99/100 MS: 1 InsertRepeatedBytes- 00:08:39.423 [2024-04-26 20:04:23.673484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.423 [2024-04-26 20:04:23.673511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.673557] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.423 [2024-04-26 20:04:23.673574] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.673603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.423 [2024-04-26 20:04:23.673618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.673646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.423 [2024-04-26 20:04:23.673660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.423 
#42 NEW cov: 11951 ft: 14117 corp: 27/2307b lim: 100 exec/s: 42 rss: 72Mb L: 92/100 MS: 1 InsertByte- 00:08:39.423 [2024-04-26 20:04:23.723607] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.423 [2024-04-26 20:04:23.723633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.723679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.423 [2024-04-26 20:04:23.723696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.723725] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.423 [2024-04-26 20:04:23.723740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.723768] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.423 [2024-04-26 20:04:23.723782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.423 #43 NEW cov: 11958 ft: 14131 corp: 28/2405b lim: 100 exec/s: 43 rss: 72Mb L: 98/100 MS: 1 ChangeByte- 00:08:39.423 [2024-04-26 20:04:23.773660] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.423 [2024-04-26 20:04:23.773686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.773734] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.423 [2024-04-26 20:04:23.773750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.423 #44 NEW cov: 11958 ft: 14150 corp: 29/2461b lim: 100 exec/s: 44 rss: 72Mb L: 56/100 MS: 1 EraseBytes- 00:08:39.423 [2024-04-26 20:04:23.843928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:39.423 [2024-04-26 20:04:23.843954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.844000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:39.423 [2024-04-26 20:04:23.844017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.844046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:39.423 [2024-04-26 20:04:23.844061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:39.423 [2024-04-26 20:04:23.844089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:39.423 [2024-04-26 20:04:23.844103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:39.683 #45 NEW cov: 11958 ft: 14155 corp: 30/2560b lim: 100 exec/s: 22 rss: 
72Mb L: 99/100 MS: 1 ChangeBit- 00:08:39.683 #45 DONE cov: 11958 ft: 14155 corp: 30/2560b lim: 100 exec/s: 22 rss: 72Mb 00:08:39.683 ###### Recommended dictionary. ###### 00:08:39.683 "\271\030Z_\305\336\012\000" # Uses: 0 00:08:39.683 "\201\000\000\000\000\000\000\000" # Uses: 2 00:08:39.683 ###### End of recommended dictionary. ###### 00:08:39.683 Done 45 runs in 2 second(s) 00:08:39.683 20:04:24 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_18.conf /var/tmp/suppress_nvmf_fuzz 00:08:39.683 20:04:24 -- ../common.sh@72 -- # (( i++ )) 00:08:39.683 20:04:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.683 20:04:24 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1 00:08:39.683 20:04:24 -- nvmf/run.sh@23 -- # local fuzzer_type=19 00:08:39.683 20:04:24 -- nvmf/run.sh@24 -- # local timen=1 00:08:39.683 20:04:24 -- nvmf/run.sh@25 -- # local core=0x1 00:08:39.683 20:04:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:39.683 20:04:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf 00:08:39.683 20:04:24 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:39.683 20:04:24 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:39.683 20:04:24 -- nvmf/run.sh@34 -- # printf %02d 19 00:08:39.683 20:04:24 -- nvmf/run.sh@34 -- # port=4419 00:08:39.683 20:04:24 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:39.683 20:04:24 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' 00:08:39.683 20:04:24 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:39.683 20:04:24 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:39.683 20:04:24 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:39.683 20:04:24 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 00:08:39.683 [2024-04-26 20:04:24.060804] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:39.683 [2024-04-26 20:04:24.060892] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1627688 ] 00:08:39.683 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.942 [2024-04-26 20:04:24.254718] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.942 [2024-04-26 20:04:24.325788] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.942 [2024-04-26 20:04:24.385001] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:40.201 [2024-04-26 20:04:24.401206] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 *** 00:08:40.201 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:40.201 INFO: Seed: 3602998862 00:08:40.201 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:40.201 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:40.201 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 00:08:40.201 INFO: A corpus is not provided, starting from an empty corpus 00:08:40.201 #2 INITED exec/s: 0 rss: 63Mb 00:08:40.201 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:40.201 This may also happen if the target rejected all inputs we tried so far 00:08:40.201 [2024-04-26 20:04:24.456586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:40.201 [2024-04-26 20:04:24.456617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.201 [2024-04-26 20:04:24.456650] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:40.201 [2024-04-26 20:04:24.456665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.201 [2024-04-26 20:04:24.456716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:40.201 [2024-04-26 20:04:24.456730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.461 NEW_FUNC[1/670]: 0x4a23f0 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:40.461 NEW_FUNC[2/670]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:40.461 #16 NEW cov: 11687 ft: 11688 corp: 2/37b lim: 50 exec/s: 0 rss: 69Mb L: 36/36 MS: 4 CopyPart-CrossOver-InsertByte-InsertRepeatedBytes- 00:08:40.461 [2024-04-26 20:04:24.788757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:40.461 [2024-04-26 20:04:24.788806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.461 [2024-04-26 20:04:24.788916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65350 00:08:40.461 [2024-04-26 20:04:24.788939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.461 [2024-04-26 20:04:24.789033] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:40.461 [2024-04-26 20:04:24.789057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.461 #17 NEW cov: 11821 ft: 12205 corp: 3/73b lim: 50 exec/s: 0 rss: 69Mb L: 36/36 MS: 1 ChangeByte- 00:08:40.461 [2024-04-26 20:04:24.849361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:40.462 [2024-04-26 20:04:24.849390] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.462 [2024-04-26 20:04:24.849461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:40.462 [2024-04-26 20:04:24.849479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.462 [2024-04-26 20:04:24.849552] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:40.462 [2024-04-26 20:04:24.849568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.462 [2024-04-26 20:04:24.849643] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:40.462 [2024-04-26 20:04:24.849662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.462 [2024-04-26 20:04:24.849747] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:40.462 [2024-04-26 20:04:24.849766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:40.462 #23 NEW cov: 11827 ft: 12791 corp: 4/123b lim: 50 exec/s: 0 rss: 69Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:40.462 [2024-04-26 20:04:24.899369] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:40.462 [2024-04-26 20:04:24.899398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.462 [2024-04-26 20:04:24.899464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 00:08:40.462 [2024-04-26 20:04:24.899483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.462 [2024-04-26 20:04:24.899560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:40.462 [2024-04-26 20:04:24.899579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.462 [2024-04-26 20:04:24.899671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 00:08:40.462 [2024-04-26 20:04:24.899687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.721 #28 NEW cov: 11912 ft: 13048 corp: 5/167b lim: 50 exec/s: 0 rss: 69Mb L: 44/50 MS: 5 CrossOver-CMP-InsertByte-ShuffleBytes-InsertRepeatedBytes- DE: "\001\000\000\001"- 00:08:40.721 [2024-04-26 20:04:24.949505] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:40.721 [2024-04-26 20:04:24.949532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.721 [2024-04-26 20:04:24.949597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 00:08:40.721 [2024-04-26 20:04:24.949614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.721 [2024-04-26 20:04:24.949679] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:40.721 [2024-04-26 20:04:24.949695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.721 [2024-04-26 20:04:24.949781] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 00:08:40.721 [2024-04-26 20:04:24.949799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.721 #29 NEW cov: 11912 ft: 13112 corp: 6/211b lim: 50 exec/s: 0 rss: 70Mb L: 44/50 MS: 1 ShuffleBytes- 00:08:40.721 [2024-04-26 20:04:25.009504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:40.721 [2024-04-26 20:04:25.009529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.721 [2024-04-26 20:04:25.009610] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:40.721 [2024-04-26 20:04:25.009626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.721 [2024-04-26 20:04:25.009704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709503999 len:65536 00:08:40.721 [2024-04-26 20:04:25.009722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.721 #30 NEW cov: 11912 ft: 13181 corp: 7/250b lim: 50 exec/s: 0 rss: 70Mb L: 39/50 MS: 1 InsertRepeatedBytes- 00:08:40.721 [2024-04-26 20:04:25.059959] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:40.721 [2024-04-26 20:04:25.059986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.721 [2024-04-26 20:04:25.060067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:2199023255297 len:65536 00:08:40.721 [2024-04-26 20:04:25.060084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.721 [2024-04-26 20:04:25.060159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446539564546785279 len:65536 00:08:40.721 [2024-04-26 20:04:25.060177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.722 [2024-04-26 20:04:25.060272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 
cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:40.722 [2024-04-26 20:04:25.060292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.722 #31 NEW cov: 11912 ft: 13257 corp: 8/293b lim: 50 exec/s: 0 rss: 70Mb L: 43/50 MS: 1 PersAutoDict- DE: "\001\000\000\001"- 00:08:40.722 [2024-04-26 20:04:25.120117] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:40.722 [2024-04-26 20:04:25.120143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.722 [2024-04-26 20:04:25.120212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3151601358058863804 len:48317 00:08:40.722 [2024-04-26 20:04:25.120228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.722 [2024-04-26 20:04:25.120286] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:40.722 [2024-04-26 20:04:25.120302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.722 [2024-04-26 20:04:25.120390] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 00:08:40.722 [2024-04-26 20:04:25.120409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:40.722 #32 NEW cov: 11912 ft: 13283 corp: 9/338b lim: 50 exec/s: 0 rss: 70Mb L: 45/50 MS: 1 InsertByte- 00:08:40.981 [2024-04-26 20:04:25.169997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:40.981 [2024-04-26 20:04:25.170025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.981 [2024-04-26 20:04:25.170093] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462603027808255 len:4 00:08:40.981 [2024-04-26 20:04:25.170114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.981 [2024-04-26 20:04:25.170181] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:40.981 [2024-04-26 20:04:25.170202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.981 #33 NEW cov: 11912 ft: 13312 corp: 10/374b lim: 50 exec/s: 0 rss: 70Mb L: 36/50 MS: 1 ChangeBinInt- 00:08:40.981 [2024-04-26 20:04:25.220179] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:40.981 [2024-04-26 20:04:25.220204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.981 [2024-04-26 20:04:25.220277] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 
len:65536 00:08:40.981 [2024-04-26 20:04:25.220295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.981 [2024-04-26 20:04:25.220365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:40.982 [2024-04-26 20:04:25.220381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.982 #34 NEW cov: 11912 ft: 13349 corp: 11/410b lim: 50 exec/s: 0 rss: 70Mb L: 36/50 MS: 1 ChangeByte- 00:08:40.982 [2024-04-26 20:04:25.270400] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:40.982 [2024-04-26 20:04:25.270429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.982 [2024-04-26 20:04:25.270482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 00:08:40.982 [2024-04-26 20:04:25.270502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.982 [2024-04-26 20:04:25.270550] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:40.982 [2024-04-26 20:04:25.270567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.982 #35 NEW cov: 11912 ft: 13390 corp: 12/442b lim: 50 exec/s: 0 rss: 70Mb L: 32/50 MS: 1 EraseBytes- 00:08:40.982 [2024-04-26 20:04:25.320555] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:40.982 [2024-04-26 20:04:25.320582] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.982 [2024-04-26 20:04:25.320658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 00:08:40.982 [2024-04-26 20:04:25.320676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.982 [2024-04-26 20:04:25.320752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:40.982 [2024-04-26 20:04:25.320769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:40.982 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:40.982 #36 NEW cov: 11935 ft: 13502 corp: 13/477b lim: 50 exec/s: 0 rss: 70Mb L: 35/50 MS: 1 EraseBytes- 00:08:40.982 [2024-04-26 20:04:25.380548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:40.982 [2024-04-26 20:04:25.380576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:40.982 [2024-04-26 20:04:25.380656] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:40.982 [2024-04-26 20:04:25.380674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:40.982 #37 NEW cov: 11935 ft: 13765 corp: 14/504b lim: 50 exec/s: 0 rss: 70Mb L: 27/50 MS: 1 EraseBytes- 00:08:41.241 [2024-04-26 20:04:25.431230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:14136 00:08:41.241 [2024-04-26 20:04:25.431259] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.431320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3978930267413231415 len:65536 00:08:41.241 [2024-04-26 20:04:25.431340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.431399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.241 [2024-04-26 20:04:25.431418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.431509] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:41.241 [2024-04-26 20:04:25.431527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.241 #38 NEW cov: 11935 ft: 13793 corp: 15/548b lim: 50 exec/s: 38 rss: 70Mb L: 44/50 MS: 1 InsertRepeatedBytes- 00:08:41.241 [2024-04-26 20:04:25.481199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:41.241 [2024-04-26 20:04:25.481235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.481320] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446462603027808255 len:4 00:08:41.241 [2024-04-26 20:04:25.481341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.481429] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.241 [2024-04-26 20:04:25.481447] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.241 #39 NEW cov: 11935 ft: 13862 corp: 16/584b lim: 50 exec/s: 39 rss: 70Mb L: 36/50 MS: 1 ShuffleBytes- 00:08:41.241 [2024-04-26 20:04:25.541306] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:41.241 [2024-04-26 20:04:25.541339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.541398] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 
lba:13599952493558414524 len:48317 00:08:41.241 [2024-04-26 20:04:25.541419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.541476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:41.241 [2024-04-26 20:04:25.541495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.241 #40 NEW cov: 11935 ft: 13873 corp: 17/621b lim: 50 exec/s: 40 rss: 70Mb L: 37/50 MS: 1 EraseBytes- 00:08:41.241 [2024-04-26 20:04:25.601576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:41.241 [2024-04-26 20:04:25.601604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.601672] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 00:08:41.241 [2024-04-26 20:04:25.601688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.601752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:125182408453586944 len:48317 00:08:41.241 [2024-04-26 20:04:25.601768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.241 #41 NEW cov: 11935 ft: 13884 corp: 18/660b lim: 50 exec/s: 41 rss: 70Mb L: 39/50 MS: 1 PersAutoDict- DE: "\001\000\000\001"- 00:08:41.241 [2024-04-26 20:04:25.661495] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:41.241 [2024-04-26 20:04:25.661524] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.241 [2024-04-26 20:04:25.661583] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4294967295 len:1 00:08:41.241 [2024-04-26 20:04:25.661603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.501 #42 NEW cov: 11935 ft: 13917 corp: 19/687b lim: 50 exec/s: 42 rss: 70Mb L: 27/50 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\017"- 00:08:41.501 [2024-04-26 20:04:25.722256] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:671088640 len:1 00:08:41.501 [2024-04-26 20:04:25.722287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.722340] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:41.501 [2024-04-26 20:04:25.722357] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.722408] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:41.501 [2024-04-26 20:04:25.722426] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.722514] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:41.501 [2024-04-26 20:04:25.722532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.501 #45 NEW cov: 11935 ft: 13966 corp: 20/732b lim: 50 exec/s: 45 rss: 70Mb L: 45/50 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:41.501 [2024-04-26 20:04:25.772004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:41.501 [2024-04-26 20:04:25.772030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.772098] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5044031582654955519 len:65536 00:08:41.501 [2024-04-26 20:04:25.772116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.772193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.501 [2024-04-26 20:04:25.772210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.501 #46 NEW cov: 11935 ft: 13982 corp: 21/763b lim: 50 exec/s: 46 rss: 70Mb L: 31/50 MS: 1 EraseBytes- 00:08:41.501 [2024-04-26 20:04:25.822439] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:12225489210306046377 len:43434 00:08:41.501 [2024-04-26 20:04:25.822465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.822534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:12225489209634957737 len:43434 00:08:41.501 [2024-04-26 20:04:25.822554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.822618] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:12225489209634957737 len:43434 00:08:41.501 [2024-04-26 20:04:25.822634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.822725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:12225489209634957737 len:43434 00:08:41.501 [2024-04-26 20:04:25.822746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.501 #48 NEW cov: 11935 ft: 13995 corp: 22/808b lim: 50 exec/s: 48 rss: 70Mb L: 45/50 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:41.501 [2024-04-26 20:04:25.872164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:41.501 [2024-04-26 20:04:25.872189] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.872242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:41.501 [2024-04-26 20:04:25.872262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.501 #49 NEW cov: 11935 ft: 14003 corp: 23/829b lim: 50 exec/s: 49 rss: 71Mb L: 21/50 MS: 1 EraseBytes- 00:08:41.501 [2024-04-26 20:04:25.922759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:41.501 [2024-04-26 20:04:25.922784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.922851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3151601358058863804 len:48317 00:08:41.501 [2024-04-26 20:04:25.922868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.922961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:41.501 [2024-04-26 20:04:25.922982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.501 [2024-04-26 20:04:25.923065] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13599952493558414524 len:48317 00:08:41.501 [2024-04-26 20:04:25.923084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.501 #50 NEW cov: 11935 ft: 14020 corp: 24/871b lim: 50 exec/s: 50 rss: 71Mb L: 42/50 MS: 1 EraseBytes- 00:08:41.762 [2024-04-26 20:04:25.973193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583021857 len:65536 00:08:41.762 [2024-04-26 20:04:25.973220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:25.973310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:25.973329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:25.973404] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:25.973422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:25.973506] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:25.973526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:25.973608] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:25.973626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:41.762 #51 NEW cov: 11935 ft: 14068 corp: 25/921b lim: 50 exec/s: 51 rss: 71Mb L: 50/50 MS: 1 ChangeByte- 00:08:41.762 [2024-04-26 20:04:26.033402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446743180524791807 len:65536 00:08:41.762 [2024-04-26 20:04:26.033429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.033502] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:26.033519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.033579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:26.033597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.033677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:26.033694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.033777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:26.033795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:41.762 #52 NEW cov: 11935 ft: 14113 corp: 26/971b lim: 50 exec/s: 52 rss: 71Mb L: 50/50 MS: 1 ChangeByte- 00:08:41.762 [2024-04-26 20:04:26.083311] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:41.762 [2024-04-26 20:04:26.083336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.083405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709502975 len:65536 00:08:41.762 [2024-04-26 20:04:26.083422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.083486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551429 len:65536 00:08:41.762 [2024-04-26 20:04:26.083501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.083586] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 
20:04:26.083602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.762 #53 NEW cov: 11935 ft: 14119 corp: 27/1011b lim: 50 exec/s: 53 rss: 71Mb L: 40/50 MS: 1 InsertByte- 00:08:41.762 [2024-04-26 20:04:26.133374] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:41.762 [2024-04-26 20:04:26.133400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.133466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709502975 len:65536 00:08:41.762 [2024-04-26 20:04:26.133482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.133570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18376657904020225861 len:65536 00:08:41.762 [2024-04-26 20:04:26.133586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.762 [2024-04-26 20:04:26.133671] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:41.762 [2024-04-26 20:04:26.133687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:41.762 #54 NEW cov: 11935 ft: 14127 corp: 28/1053b lim: 50 exec/s: 54 rss: 71Mb L: 42/50 MS: 1 CMP- DE: "\007\000"- 00:08:41.762 [2024-04-26 20:04:26.193601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583022079 len:65536 00:08:41.762 [2024-04-26 20:04:26.193630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:41.763 [2024-04-26 20:04:26.193692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709502975 len:65536 00:08:41.763 [2024-04-26 20:04:26.193710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:41.763 [2024-04-26 20:04:26.193765] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551429 len:65536 00:08:41.763 [2024-04-26 20:04:26.193781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:41.763 [2024-04-26 20:04:26.193869] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:41.763 [2024-04-26 20:04:26.193891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.023 #55 NEW cov: 11935 ft: 14201 corp: 29/1093b lim: 50 exec/s: 55 rss: 71Mb L: 40/50 MS: 1 ShuffleBytes- 00:08:42.023 [2024-04-26 20:04:26.243776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:42.023 [2024-04-26 20:04:26.243803] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.243877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 00:08:42.023 [2024-04-26 20:04:26.243893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.243981] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:42.023 [2024-04-26 20:04:26.244000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.244083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:488993783021824 len:48317 00:08:42.023 [2024-04-26 20:04:26.244100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.023 #56 NEW cov: 11935 ft: 14216 corp: 30/1141b lim: 50 exec/s: 56 rss: 71Mb L: 48/50 MS: 1 PersAutoDict- DE: "\001\000\000\001"- 00:08:42.023 [2024-04-26 20:04:26.293461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583077375 len:65536 00:08:42.023 [2024-04-26 20:04:26.293486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.293540] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:42.023 [2024-04-26 20:04:26.293556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.023 #57 NEW cov: 11935 ft: 14228 corp: 31/1162b lim: 50 exec/s: 57 rss: 71Mb L: 21/50 MS: 1 CopyPart- 00:08:42.023 [2024-04-26 20:04:26.344088] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:13599952492690407425 len:48317 00:08:42.023 [2024-04-26 20:04:26.344114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.344190] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:3151601358058863804 len:48317 00:08:42.023 [2024-04-26 20:04:26.344209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.344265] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:13599952493558414524 len:48317 00:08:42.023 [2024-04-26 20:04:26.344281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.344365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:13599952493562674364 len:48317 00:08:42.023 [2024-04-26 20:04:26.344382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.023 #58 NEW cov: 
11935 ft: 14250 corp: 32/1208b lim: 50 exec/s: 58 rss: 71Mb L: 46/50 MS: 1 InsertByte- 00:08:42.023 [2024-04-26 20:04:26.394008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583020031 len:65536 00:08:42.023 [2024-04-26 20:04:26.394033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.394099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:5044031582654955519 len:65536 00:08:42.023 [2024-04-26 20:04:26.394117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.394202] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:42.023 [2024-04-26 20:04:26.394220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.023 #59 NEW cov: 11935 ft: 14274 corp: 33/1239b lim: 50 exec/s: 59 rss: 71Mb L: 31/50 MS: 1 ChangeByte- 00:08:42.023 [2024-04-26 20:04:26.444647] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744069583021857 len:65536 00:08:42.023 [2024-04-26 20:04:26.444673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.444753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551551 len:65536 00:08:42.023 [2024-04-26 20:04:26.444770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.444845] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:42.023 [2024-04-26 20:04:26.444862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.444957] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:42.023 [2024-04-26 20:04:26.444977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:42.023 [2024-04-26 20:04:26.445070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 00:08:42.023 [2024-04-26 20:04:26.445086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:42.284 #60 NEW cov: 11935 ft: 14308 corp: 34/1289b lim: 50 exec/s: 30 rss: 71Mb L: 50/50 MS: 1 ChangeBit- 00:08:42.284 #60 DONE cov: 11935 ft: 14308 corp: 34/1289b lim: 50 exec/s: 30 rss: 71Mb 00:08:42.284 ###### Recommended dictionary. ###### 00:08:42.284 "\001\000\000\001" # Uses: 3 00:08:42.284 "\000\000\000\000\000\000\000\017" # Uses: 0 00:08:42.284 "\007\000" # Uses: 0 00:08:42.284 ###### End of recommended dictionary. 
###### 00:08:42.284 Done 60 runs in 2 second(s) 00:08:42.284 20:04:26 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_19.conf /var/tmp/suppress_nvmf_fuzz 00:08:42.284 20:04:26 -- ../common.sh@72 -- # (( i++ )) 00:08:42.284 20:04:26 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.284 20:04:26 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1 00:08:42.284 20:04:26 -- nvmf/run.sh@23 -- # local fuzzer_type=20 00:08:42.284 20:04:26 -- nvmf/run.sh@24 -- # local timen=1 00:08:42.284 20:04:26 -- nvmf/run.sh@25 -- # local core=0x1 00:08:42.284 20:04:26 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:42.284 20:04:26 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf 00:08:42.284 20:04:26 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:42.284 20:04:26 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:42.284 20:04:26 -- nvmf/run.sh@34 -- # printf %02d 20 00:08:42.284 20:04:26 -- nvmf/run.sh@34 -- # port=4420 00:08:42.284 20:04:26 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:42.284 20:04:26 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' 00:08:42.284 20:04:26 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:42.284 20:04:26 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:42.284 20:04:26 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:42.284 20:04:26 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 00:08:42.284 [2024-04-26 20:04:26.645309] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:42.284 [2024-04-26 20:04:26.645394] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1628043 ] 00:08:42.284 EAL: No free 2048 kB hugepages reported on node 1 00:08:42.544 [2024-04-26 20:04:26.834887] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.544 [2024-04-26 20:04:26.907241] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.544 [2024-04-26 20:04:26.966439] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:42.544 [2024-04-26 20:04:26.982671] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:08:42.804 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:42.804 INFO: Seed: 1888033233 00:08:42.804 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:42.804 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:42.804 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 00:08:42.804 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.804 #2 INITED exec/s: 0 rss: 63Mb 00:08:42.804 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:42.804 This may also happen if the target rejected all inputs we tried so far 00:08:42.804 [2024-04-26 20:04:27.037979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:42.804 [2024-04-26 20:04:27.038010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:42.804 [2024-04-26 20:04:27.038069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:42.804 [2024-04-26 20:04:27.038084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.064 NEW_FUNC[1/672]: 0x4a3fb0 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:43.064 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:43.064 #35 NEW cov: 11749 ft: 11750 corp: 2/37b lim: 90 exec/s: 0 rss: 69Mb L: 36/36 MS: 3 CrossOver-ChangeBit-InsertRepeatedBytes- 00:08:43.064 [2024-04-26 20:04:27.358804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.064 [2024-04-26 20:04:27.358842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.064 [2024-04-26 20:04:27.358904] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.064 [2024-04-26 20:04:27.358920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.064 #41 NEW cov: 11879 ft: 12157 corp: 3/75b lim: 90 exec/s: 0 rss: 69Mb L: 38/38 MS: 1 CopyPart- 00:08:43.064 [2024-04-26 20:04:27.408879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.064 [2024-04-26 20:04:27.408906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.064 [2024-04-26 20:04:27.408948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.064 [2024-04-26 20:04:27.408964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.064 #42 NEW cov: 11885 ft: 12460 corp: 4/111b lim: 90 exec/s: 0 rss: 69Mb L: 36/38 MS: 1 EraseBytes- 00:08:43.064 [2024-04-26 20:04:27.448857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.064 [2024-04-26 20:04:27.448889] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 
sqhd:0002 p:0 m:0 dnr:1 00:08:43.064 #44 NEW cov: 11970 ft: 13565 corp: 5/129b lim: 90 exec/s: 0 rss: 69Mb L: 18/38 MS: 2 InsertRepeatedBytes-CopyPart- 00:08:43.064 [2024-04-26 20:04:27.488972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.064 [2024-04-26 20:04:27.488998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.323 #45 NEW cov: 11970 ft: 13630 corp: 6/148b lim: 90 exec/s: 0 rss: 69Mb L: 19/38 MS: 1 EraseBytes- 00:08:43.323 [2024-04-26 20:04:27.529266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.323 [2024-04-26 20:04:27.529292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.323 [2024-04-26 20:04:27.529329] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.323 [2024-04-26 20:04:27.529345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.323 #53 NEW cov: 11970 ft: 13730 corp: 7/191b lim: 90 exec/s: 0 rss: 69Mb L: 43/43 MS: 3 CMP-CopyPart-InsertRepeatedBytes- DE: "\000\000\000/"- 00:08:43.323 [2024-04-26 20:04:27.569200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.323 [2024-04-26 20:04:27.569226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.323 #54 NEW cov: 11970 ft: 13869 corp: 8/210b lim: 90 exec/s: 0 rss: 70Mb L: 19/43 MS: 1 ChangeBinInt- 00:08:43.323 [2024-04-26 20:04:27.619377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.323 [2024-04-26 20:04:27.619403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.323 #55 NEW cov: 11970 ft: 13969 corp: 9/230b lim: 90 exec/s: 0 rss: 70Mb L: 20/43 MS: 1 InsertByte- 00:08:43.323 [2024-04-26 20:04:27.669509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.323 [2024-04-26 20:04:27.669537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.323 #56 NEW cov: 11970 ft: 13985 corp: 10/253b lim: 90 exec/s: 0 rss: 70Mb L: 23/43 MS: 1 PersAutoDict- DE: "\000\000\000/"- 00:08:43.323 [2024-04-26 20:04:27.709578] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.323 [2024-04-26 20:04:27.709606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.323 #57 NEW cov: 11970 ft: 14065 corp: 11/283b lim: 90 exec/s: 0 rss: 70Mb L: 30/43 MS: 1 CopyPart- 00:08:43.323 [2024-04-26 20:04:27.759761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.323 [2024-04-26 20:04:27.759788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.582 #58 NEW cov: 11970 ft: 14078 corp: 12/303b lim: 90 exec/s: 0 rss: 70Mb L: 20/43 MS: 1 
PersAutoDict- DE: "\000\000\000/"- 00:08:43.582 [2024-04-26 20:04:27.810066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.582 [2024-04-26 20:04:27.810092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.582 [2024-04-26 20:04:27.810130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.582 [2024-04-26 20:04:27.810146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.582 #59 NEW cov: 11970 ft: 14088 corp: 13/339b lim: 90 exec/s: 0 rss: 70Mb L: 36/43 MS: 1 ShuffleBytes- 00:08:43.582 [2024-04-26 20:04:27.849975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.582 [2024-04-26 20:04:27.850003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.582 #60 NEW cov: 11970 ft: 14121 corp: 14/359b lim: 90 exec/s: 0 rss: 70Mb L: 20/43 MS: 1 InsertByte- 00:08:43.582 [2024-04-26 20:04:27.890260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.582 [2024-04-26 20:04:27.890288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.582 [2024-04-26 20:04:27.890325] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.582 [2024-04-26 20:04:27.890341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.582 #61 NEW cov: 11970 ft: 14129 corp: 15/397b lim: 90 exec/s: 0 rss: 70Mb L: 38/43 MS: 1 PersAutoDict- DE: "\000\000\000/"- 00:08:43.582 [2024-04-26 20:04:27.930696] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.582 [2024-04-26 20:04:27.930723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.582 [2024-04-26 20:04:27.930772] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.582 [2024-04-26 20:04:27.930787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.582 [2024-04-26 20:04:27.930844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.582 [2024-04-26 20:04:27.930859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.582 [2024-04-26 20:04:27.930918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:43.582 [2024-04-26 20:04:27.930934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:43.582 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:43.582 #62 NEW cov: 11993 ft: 14582 corp: 16/472b lim: 90 exec/s: 0 rss: 70Mb L: 75/75 MS: 1 InsertRepeatedBytes- 00:08:43.582 [2024-04-26 
20:04:27.980673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.582 [2024-04-26 20:04:27.980703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.582 [2024-04-26 20:04:27.980740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.582 [2024-04-26 20:04:27.980755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.582 [2024-04-26 20:04:27.980812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:43.582 [2024-04-26 20:04:27.980827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:43.582 #63 NEW cov: 11993 ft: 14885 corp: 17/526b lim: 90 exec/s: 0 rss: 70Mb L: 54/75 MS: 1 CrossOver- 00:08:43.582 [2024-04-26 20:04:28.020652] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.582 [2024-04-26 20:04:28.020678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.582 [2024-04-26 20:04:28.020728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.582 [2024-04-26 20:04:28.020745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.842 #64 NEW cov: 11993 ft: 14918 corp: 18/564b lim: 90 exec/s: 64 rss: 70Mb L: 38/75 MS: 1 ChangeBit- 00:08:43.842 [2024-04-26 20:04:28.060594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.842 [2024-04-26 20:04:28.060622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.842 #67 NEW cov: 11993 ft: 14930 corp: 19/583b lim: 90 exec/s: 67 rss: 70Mb L: 19/75 MS: 3 CopyPart-InsertByte-InsertRepeatedBytes- 00:08:43.842 [2024-04-26 20:04:28.100868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.842 [2024-04-26 20:04:28.100901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.842 [2024-04-26 20:04:28.100941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.842 [2024-04-26 20:04:28.100958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.842 #68 NEW cov: 11993 ft: 14955 corp: 20/622b lim: 90 exec/s: 68 rss: 70Mb L: 39/75 MS: 1 InsertRepeatedBytes- 00:08:43.842 [2024-04-26 20:04:28.140860] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.842 [2024-04-26 20:04:28.140892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.842 #69 NEW cov: 11993 ft: 14965 corp: 21/641b lim: 90 exec/s: 69 rss: 70Mb L: 19/75 MS: 1 ShuffleBytes- 00:08:43.842 [2024-04-26 20:04:28.181110] nvme_qpair.c: 256:nvme_io_qpair_print_command: 
*NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.842 [2024-04-26 20:04:28.181136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.842 [2024-04-26 20:04:28.181176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:43.842 [2024-04-26 20:04:28.181192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:43.842 #70 NEW cov: 11993 ft: 14990 corp: 22/677b lim: 90 exec/s: 70 rss: 70Mb L: 36/75 MS: 1 ChangeBit- 00:08:43.842 [2024-04-26 20:04:28.221055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.842 [2024-04-26 20:04:28.221081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.842 #71 NEW cov: 11993 ft: 15123 corp: 23/705b lim: 90 exec/s: 71 rss: 70Mb L: 28/75 MS: 1 CMP- DE: "\325\003\000\000\000\000\000\000"- 00:08:43.842 [2024-04-26 20:04:28.261211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:43.842 [2024-04-26 20:04:28.261239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:43.842 #72 NEW cov: 11993 ft: 15141 corp: 24/723b lim: 90 exec/s: 72 rss: 70Mb L: 18/75 MS: 1 ChangeByte- 00:08:44.101 [2024-04-26 20:04:28.301295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.101 [2024-04-26 20:04:28.301322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.101 #73 NEW cov: 11993 ft: 15174 corp: 25/751b lim: 90 exec/s: 73 rss: 70Mb L: 28/75 MS: 1 ShuffleBytes- 00:08:44.101 [2024-04-26 20:04:28.351416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.101 [2024-04-26 20:04:28.351443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.101 #74 NEW cov: 11993 ft: 15199 corp: 26/770b lim: 90 exec/s: 74 rss: 71Mb L: 19/75 MS: 1 ChangeBinInt- 00:08:44.101 [2024-04-26 20:04:28.401541] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.102 [2024-04-26 20:04:28.401569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.102 #75 NEW cov: 11993 ft: 15227 corp: 27/789b lim: 90 exec/s: 75 rss: 71Mb L: 19/75 MS: 1 CMP- DE: "\001\000\177#\210\016\354\273"- 00:08:44.102 [2024-04-26 20:04:28.442020] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.102 [2024-04-26 20:04:28.442046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.102 [2024-04-26 20:04:28.442094] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.102 [2024-04-26 20:04:28.442110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.102 
[2024-04-26 20:04:28.442168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.102 [2024-04-26 20:04:28.442184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.102 #76 NEW cov: 11993 ft: 15228 corp: 28/844b lim: 90 exec/s: 76 rss: 71Mb L: 55/75 MS: 1 InsertByte- 00:08:44.102 [2024-04-26 20:04:28.491839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.102 [2024-04-26 20:04:28.491866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.102 #77 NEW cov: 11993 ft: 15291 corp: 29/864b lim: 90 exec/s: 77 rss: 71Mb L: 20/75 MS: 1 ChangeBit- 00:08:44.102 [2024-04-26 20:04:28.542025] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.102 [2024-04-26 20:04:28.542053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.361 #78 NEW cov: 11993 ft: 15299 corp: 30/898b lim: 90 exec/s: 78 rss: 71Mb L: 34/75 MS: 1 EraseBytes- 00:08:44.361 [2024-04-26 20:04:28.592108] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.361 [2024-04-26 20:04:28.592134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.361 #79 NEW cov: 11993 ft: 15304 corp: 31/917b lim: 90 exec/s: 79 rss: 71Mb L: 19/75 MS: 1 ChangeBinInt- 00:08:44.361 [2024-04-26 20:04:28.632388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.361 [2024-04-26 20:04:28.632417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.361 [2024-04-26 20:04:28.632467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.361 [2024-04-26 20:04:28.632483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.361 #80 NEW cov: 11993 ft: 15312 corp: 32/953b lim: 90 exec/s: 80 rss: 71Mb L: 36/75 MS: 1 ChangeBinInt- 00:08:44.361 [2024-04-26 20:04:28.672361] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.361 [2024-04-26 20:04:28.672387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.361 #81 NEW cov: 11993 ft: 15314 corp: 33/971b lim: 90 exec/s: 81 rss: 71Mb L: 18/75 MS: 1 ChangeBinInt- 00:08:44.361 [2024-04-26 20:04:28.712507] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.361 [2024-04-26 20:04:28.712534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.361 #82 NEW cov: 11993 ft: 15376 corp: 34/989b lim: 90 exec/s: 82 rss: 71Mb L: 18/75 MS: 1 ChangeBinInt- 00:08:44.361 [2024-04-26 20:04:28.752932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.361 [2024-04-26 20:04:28.752958] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.361 [2024-04-26 20:04:28.752995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.361 [2024-04-26 20:04:28.753010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.361 [2024-04-26 20:04:28.753065] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:44.361 [2024-04-26 20:04:28.753080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:44.361 #83 NEW cov: 11993 ft: 15449 corp: 35/1048b lim: 90 exec/s: 83 rss: 71Mb L: 59/75 MS: 1 InsertRepeatedBytes- 00:08:44.361 [2024-04-26 20:04:28.792876] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.361 [2024-04-26 20:04:28.792902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.361 [2024-04-26 20:04:28.792942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.361 [2024-04-26 20:04:28.792958] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.620 #84 NEW cov: 11993 ft: 15461 corp: 36/1086b lim: 90 exec/s: 84 rss: 72Mb L: 38/75 MS: 1 CopyPart- 00:08:44.620 [2024-04-26 20:04:28.832839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.620 [2024-04-26 20:04:28.832864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.620 #85 NEW cov: 11993 ft: 15484 corp: 37/1109b lim: 90 exec/s: 85 rss: 72Mb L: 23/75 MS: 1 CMP- DE: "\000\000\000\016"- 00:08:44.620 [2024-04-26 20:04:28.883053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.620 [2024-04-26 20:04:28.883079] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.620 #86 NEW cov: 11993 ft: 15489 corp: 38/1132b lim: 90 exec/s: 86 rss: 72Mb L: 23/75 MS: 1 CrossOver- 00:08:44.620 [2024-04-26 20:04:28.933278] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.620 [2024-04-26 20:04:28.933303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.620 [2024-04-26 20:04:28.933344] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:44.620 [2024-04-26 20:04:28.933361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:44.620 #89 NEW cov: 11993 ft: 15498 corp: 39/1170b lim: 90 exec/s: 89 rss: 72Mb L: 38/75 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:08:44.620 [2024-04-26 20:04:28.973242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.620 [2024-04-26 20:04:28.973268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.620 #90 NEW cov: 11993 ft: 15499 corp: 40/1204b lim: 90 exec/s: 90 rss: 72Mb L: 34/75 MS: 1 CrossOver- 00:08:44.620 [2024-04-26 20:04:29.013332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:44.620 [2024-04-26 20:04:29.013359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:44.620 #91 NEW cov: 11993 ft: 15552 corp: 41/1231b lim: 90 exec/s: 45 rss: 72Mb L: 27/75 MS: 1 InsertRepeatedBytes- 00:08:44.620 #91 DONE cov: 11993 ft: 15552 corp: 41/1231b lim: 90 exec/s: 45 rss: 72Mb 00:08:44.620 ###### Recommended dictionary. ###### 00:08:44.620 "\000\000\000/" # Uses: 3 00:08:44.620 "\325\003\000\000\000\000\000\000" # Uses: 0 00:08:44.620 "\001\000\177#\210\016\354\273" # Uses: 0 00:08:44.620 "\000\000\000\016" # Uses: 0 00:08:44.620 ###### End of recommended dictionary. ###### 00:08:44.620 Done 91 runs in 2 second(s) 00:08:44.879 20:04:29 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_20.conf /var/tmp/suppress_nvmf_fuzz 00:08:44.879 20:04:29 -- ../common.sh@72 -- # (( i++ )) 00:08:44.879 20:04:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:44.879 20:04:29 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:44.879 20:04:29 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:44.879 20:04:29 -- nvmf/run.sh@24 -- # local timen=1 00:08:44.879 20:04:29 -- nvmf/run.sh@25 -- # local core=0x1 00:08:44.879 20:04:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:44.879 20:04:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:44.879 20:04:29 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:44.879 20:04:29 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:44.879 20:04:29 -- nvmf/run.sh@34 -- # printf %02d 21 00:08:44.879 20:04:29 -- nvmf/run.sh@34 -- # port=4421 00:08:44.879 20:04:29 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:44.879 20:04:29 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:44.879 20:04:29 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:44.879 20:04:29 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:44.879 20:04:29 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:44.879 20:04:29 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 00:08:44.879 [2024-04-26 20:04:29.211757] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:08:44.879 [2024-04-26 20:04:29.211813] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1628403 ] 00:08:44.879 EAL: No free 2048 kB hugepages reported on node 1 00:08:45.138 [2024-04-26 20:04:29.400052] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:45.138 [2024-04-26 20:04:29.469704] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.138 [2024-04-26 20:04:29.529116] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:45.138 [2024-04-26 20:04:29.545321] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:45.138 INFO: Running with entropic power schedule (0xFF, 100). 00:08:45.138 INFO: Seed: 157064714 00:08:45.138 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:45.138 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:45.138 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:45.138 INFO: A corpus is not provided, starting from an empty corpus 00:08:45.138 #2 INITED exec/s: 0 rss: 63Mb 00:08:45.138 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:45.138 This may also happen if the target rejected all inputs we tried so far 00:08:45.396 [2024-04-26 20:04:29.600613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.396 [2024-04-26 20:04:29.600645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.396 [2024-04-26 20:04:29.600700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.396 [2024-04-26 20:04:29.600717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.655 NEW_FUNC[1/672]: 0x4a71d0 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:45.655 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:45.655 #6 NEW cov: 11724 ft: 11725 corp: 2/26b lim: 50 exec/s: 0 rss: 69Mb L: 25/25 MS: 4 ChangeBinInt-InsertByte-CopyPart-InsertRepeatedBytes- 00:08:45.655 [2024-04-26 20:04:29.931442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.655 [2024-04-26 20:04:29.931479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.655 [2024-04-26 20:04:29.931539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.655 [2024-04-26 20:04:29.931556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.655 #13 NEW cov: 11854 ft: 12273 corp: 3/53b lim: 50 exec/s: 0 rss: 69Mb L: 27/27 MS: 2 InsertByte-CrossOver- 00:08:45.655 [2024-04-26 20:04:29.971478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.655 
[2024-04-26 20:04:29.971506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.655 [2024-04-26 20:04:29.971563] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.655 [2024-04-26 20:04:29.971580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.655 #14 NEW cov: 11860 ft: 12475 corp: 4/82b lim: 50 exec/s: 0 rss: 70Mb L: 29/29 MS: 1 InsertRepeatedBytes- 00:08:45.655 [2024-04-26 20:04:30.011653] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.655 [2024-04-26 20:04:30.011683] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.655 [2024-04-26 20:04:30.011740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.655 [2024-04-26 20:04:30.011758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.655 #15 NEW cov: 11945 ft: 12796 corp: 5/109b lim: 50 exec/s: 0 rss: 70Mb L: 27/29 MS: 1 CrossOver- 00:08:45.655 [2024-04-26 20:04:30.061818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.655 [2024-04-26 20:04:30.061853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.655 [2024-04-26 20:04:30.061914] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.655 [2024-04-26 20:04:30.061931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.655 #16 NEW cov: 11945 ft: 12917 corp: 6/135b lim: 50 exec/s: 0 rss: 70Mb L: 26/29 MS: 1 InsertByte- 00:08:45.913 [2024-04-26 20:04:30.111952] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.913 [2024-04-26 20:04:30.111983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.913 [2024-04-26 20:04:30.112033] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.913 [2024-04-26 20:04:30.112050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.913 #22 NEW cov: 11945 ft: 13002 corp: 7/156b lim: 50 exec/s: 0 rss: 70Mb L: 21/29 MS: 1 EraseBytes- 00:08:45.913 [2024-04-26 20:04:30.162184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.913 [2024-04-26 20:04:30.162212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.913 [2024-04-26 20:04:30.162251] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.913 [2024-04-26 20:04:30.162267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.913 [2024-04-26 20:04:30.162322] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.913 [2024-04-26 20:04:30.162338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:45.913 #23 NEW cov: 11945 ft: 13371 corp: 8/187b lim: 50 exec/s: 0 rss: 70Mb L: 31/31 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:45.913 [2024-04-26 20:04:30.202156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.913 [2024-04-26 20:04:30.202183] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.913 [2024-04-26 20:04:30.202219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.913 [2024-04-26 20:04:30.202235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.913 #29 NEW cov: 11945 ft: 13520 corp: 9/216b lim: 50 exec/s: 0 rss: 70Mb L: 29/31 MS: 1 CopyPart- 00:08:45.913 [2024-04-26 20:04:30.252285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.913 [2024-04-26 20:04:30.252312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.913 [2024-04-26 20:04:30.252350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.913 [2024-04-26 20:04:30.252365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.913 #30 NEW cov: 11945 ft: 13554 corp: 10/238b lim: 50 exec/s: 0 rss: 70Mb L: 22/31 MS: 1 EraseBytes- 00:08:45.913 [2024-04-26 20:04:30.292422] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.913 [2024-04-26 20:04:30.292448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.913 [2024-04-26 20:04:30.292485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.913 [2024-04-26 20:04:30.292500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.913 #31 NEW cov: 11945 ft: 13594 corp: 11/265b lim: 50 exec/s: 0 rss: 70Mb L: 27/31 MS: 1 ChangeBit- 00:08:45.913 [2024-04-26 20:04:30.332670] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:45.913 [2024-04-26 20:04:30.332696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:45.913 [2024-04-26 20:04:30.332732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:45.913 [2024-04-26 20:04:30.332748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:45.913 [2024-04-26 20:04:30.332805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:45.913 [2024-04-26 20:04:30.332821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.172 #32 NEW cov: 11945 ft: 13680 corp: 12/300b lim: 50 exec/s: 0 rss: 70Mb L: 35/35 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:46.172 [2024-04-26 20:04:30.382691] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.172 [2024-04-26 20:04:30.382717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.172 [2024-04-26 20:04:30.382753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.172 [2024-04-26 20:04:30.382769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.172 #33 NEW cov: 11945 ft: 13707 corp: 13/327b lim: 50 exec/s: 0 rss: 70Mb L: 27/35 MS: 1 ChangeByte- 00:08:46.172 [2024-04-26 20:04:30.422796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.172 [2024-04-26 20:04:30.422822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.172 [2024-04-26 20:04:30.422862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.172 [2024-04-26 20:04:30.422882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.172 #34 NEW cov: 11945 ft: 13746 corp: 14/351b lim: 50 exec/s: 0 rss: 70Mb L: 24/35 MS: 1 EraseBytes- 00:08:46.172 [2024-04-26 20:04:30.473104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.172 [2024-04-26 20:04:30.473148] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.172 [2024-04-26 20:04:30.473185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.172 [2024-04-26 20:04:30.473202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.172 [2024-04-26 20:04:30.473257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.172 [2024-04-26 20:04:30.473273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.172 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:46.172 #35 NEW cov: 11968 ft: 13802 corp: 15/386b lim: 50 exec/s: 0 rss: 70Mb L: 35/35 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:46.172 [2024-04-26 20:04:30.513063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.172 [2024-04-26 20:04:30.513092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.172 [2024-04-26 20:04:30.513137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.172 [2024-04-26 20:04:30.513156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.172 #36 NEW cov: 11968 ft: 13850 corp: 16/415b lim: 50 exec/s: 0 rss: 70Mb L: 29/35 MS: 1 CrossOver- 00:08:46.172 [2024-04-26 20:04:30.553166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.172 [2024-04-26 20:04:30.553193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.172 [2024-04-26 20:04:30.553231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.172 [2024-04-26 20:04:30.553247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.172 #37 NEW cov: 11968 ft: 13873 corp: 17/440b lim: 50 exec/s: 37 rss: 71Mb L: 25/35 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:46.172 [2024-04-26 20:04:30.603291] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.172 [2024-04-26 20:04:30.603318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.172 [2024-04-26 20:04:30.603386] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.172 [2024-04-26 20:04:30.603401] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.430 #38 NEW cov: 11968 ft: 13950 corp: 18/463b lim: 50 exec/s: 38 rss: 71Mb L: 23/35 MS: 1 EraseBytes- 00:08:46.430 [2024-04-26 20:04:30.643592] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.430 [2024-04-26 20:04:30.643619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.430 [2024-04-26 20:04:30.643666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.430 [2024-04-26 20:04:30.643682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.430 [2024-04-26 20:04:30.643738] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.430 [2024-04-26 20:04:30.643754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.430 #39 NEW cov: 11968 ft: 13957 corp: 19/500b lim: 50 exec/s: 39 rss: 71Mb L: 37/37 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:46.430 [2024-04-26 20:04:30.683543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.430 [2024-04-26 20:04:30.683570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.430 [2024-04-26 20:04:30.683608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.430 [2024-04-26 20:04:30.683623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.430 #40 NEW cov: 11968 ft: 13979 corp: 20/529b lim: 50 exec/s: 40 rss: 71Mb L: 29/37 MS: 1 CrossOver- 
00:08:46.430 [2024-04-26 20:04:30.723846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.431 [2024-04-26 20:04:30.723877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.431 [2024-04-26 20:04:30.723930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.431 [2024-04-26 20:04:30.723946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.431 [2024-04-26 20:04:30.724003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.431 [2024-04-26 20:04:30.724022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.431 #41 NEW cov: 11968 ft: 14008 corp: 21/567b lim: 50 exec/s: 41 rss: 71Mb L: 38/38 MS: 1 InsertByte- 00:08:46.431 [2024-04-26 20:04:30.773787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.431 [2024-04-26 20:04:30.773814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.431 [2024-04-26 20:04:30.773868] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.431 [2024-04-26 20:04:30.773891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.431 #42 NEW cov: 11968 ft: 14042 corp: 22/591b lim: 50 exec/s: 42 rss: 71Mb L: 24/38 MS: 1 ChangeBit- 00:08:46.431 [2024-04-26 20:04:30.814042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.431 [2024-04-26 20:04:30.814070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.431 [2024-04-26 20:04:30.814116] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.431 [2024-04-26 20:04:30.814131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.431 [2024-04-26 20:04:30.814185] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.431 [2024-04-26 20:04:30.814201] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.431 #43 NEW cov: 11968 ft: 14056 corp: 23/621b lim: 50 exec/s: 43 rss: 71Mb L: 30/38 MS: 1 InsertByte- 00:08:46.431 [2024-04-26 20:04:30.853722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.431 [2024-04-26 20:04:30.853749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.431 [2024-04-26 20:04:30.853804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.431 [2024-04-26 20:04:30.853820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:46.688 #44 NEW cov: 11968 ft: 14080 corp: 24/650b lim: 50 exec/s: 44 rss: 71Mb L: 29/38 MS: 1 ChangeByte- 00:08:46.688 [2024-04-26 20:04:30.894137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.688 [2024-04-26 20:04:30.894165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.688 [2024-04-26 20:04:30.894218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.688 [2024-04-26 20:04:30.894235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.688 #45 NEW cov: 11968 ft: 14101 corp: 25/675b lim: 50 exec/s: 45 rss: 71Mb L: 25/38 MS: 1 ChangeBinInt- 00:08:46.688 [2024-04-26 20:04:30.934393] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.688 [2024-04-26 20:04:30.934420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.688 [2024-04-26 20:04:30.934461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.688 [2024-04-26 20:04:30.934476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.688 [2024-04-26 20:04:30.934531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.688 [2024-04-26 20:04:30.934549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.688 #46 NEW cov: 11968 ft: 14176 corp: 26/712b lim: 50 exec/s: 46 rss: 71Mb L: 37/38 MS: 1 ChangeBit- 00:08:46.688 [2024-04-26 20:04:30.974360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.688 [2024-04-26 20:04:30.974389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.688 [2024-04-26 20:04:30.974444] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.688 [2024-04-26 20:04:30.974460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.688 #47 NEW cov: 11968 ft: 14193 corp: 27/736b lim: 50 exec/s: 47 rss: 71Mb L: 24/38 MS: 1 ChangeBinInt- 00:08:46.688 [2024-04-26 20:04:31.014780] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.688 [2024-04-26 20:04:31.014807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.688 [2024-04-26 20:04:31.014848] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.688 [2024-04-26 20:04:31.014864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.688 [2024-04-26 20:04:31.014942] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.688 [2024-04-26 20:04:31.014959] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.688 [2024-04-26 20:04:31.015014] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:46.688 [2024-04-26 20:04:31.015028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:46.688 #48 NEW cov: 11968 ft: 14534 corp: 28/783b lim: 50 exec/s: 48 rss: 71Mb L: 47/47 MS: 1 CopyPart- 00:08:46.688 [2024-04-26 20:04:31.064481] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.688 [2024-04-26 20:04:31.064508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.688 #49 NEW cov: 11968 ft: 15305 corp: 29/800b lim: 50 exec/s: 49 rss: 72Mb L: 17/47 MS: 1 CrossOver- 00:08:46.689 [2024-04-26 20:04:31.114926] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.689 [2024-04-26 20:04:31.114953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.689 [2024-04-26 20:04:31.114991] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.689 [2024-04-26 20:04:31.115007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.689 [2024-04-26 20:04:31.115066] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.689 [2024-04-26 20:04:31.115082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.946 #55 NEW cov: 11968 ft: 15326 corp: 30/836b lim: 50 exec/s: 55 rss: 72Mb L: 36/47 MS: 1 EraseBytes- 00:08:46.947 [2024-04-26 20:04:31.164938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.947 [2024-04-26 20:04:31.164965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.947 [2024-04-26 20:04:31.165004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.947 [2024-04-26 20:04:31.165023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.947 #56 NEW cov: 11968 ft: 15337 corp: 31/860b lim: 50 exec/s: 56 rss: 72Mb L: 24/47 MS: 1 ShuffleBytes- 00:08:46.947 [2024-04-26 20:04:31.205085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.947 [2024-04-26 20:04:31.205112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.947 [2024-04-26 20:04:31.205150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.947 [2024-04-26 20:04:31.205167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.947 #57 NEW cov: 11968 ft: 15348 corp: 32/883b lim: 50 exec/s: 57 rss: 72Mb L: 23/47 MS: 
1 ChangeBit- 00:08:46.947 [2024-04-26 20:04:31.245150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.947 [2024-04-26 20:04:31.245177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.947 [2024-04-26 20:04:31.245214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.947 [2024-04-26 20:04:31.245230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.947 #58 NEW cov: 11968 ft: 15352 corp: 33/907b lim: 50 exec/s: 58 rss: 72Mb L: 24/47 MS: 1 ShuffleBytes- 00:08:46.947 [2024-04-26 20:04:31.285231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.947 [2024-04-26 20:04:31.285257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.947 [2024-04-26 20:04:31.285295] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.947 [2024-04-26 20:04:31.285311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.947 #59 NEW cov: 11968 ft: 15358 corp: 34/931b lim: 50 exec/s: 59 rss: 72Mb L: 24/47 MS: 1 ChangeBit- 00:08:46.947 [2024-04-26 20:04:31.325512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.947 [2024-04-26 20:04:31.325538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.947 [2024-04-26 20:04:31.325600] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.947 [2024-04-26 20:04:31.325616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.947 [2024-04-26 20:04:31.325672] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.947 [2024-04-26 20:04:31.325687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:46.947 #60 NEW cov: 11968 ft: 15366 corp: 35/966b lim: 50 exec/s: 60 rss: 72Mb L: 35/47 MS: 1 CrossOver- 00:08:46.947 [2024-04-26 20:04:31.365656] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:46.947 [2024-04-26 20:04:31.365682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:46.947 [2024-04-26 20:04:31.365735] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:46.947 [2024-04-26 20:04:31.365751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:46.947 [2024-04-26 20:04:31.365807] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:46.947 [2024-04-26 20:04:31.365823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 
dnr:1 00:08:47.205 #61 NEW cov: 11968 ft: 15372 corp: 36/998b lim: 50 exec/s: 61 rss: 72Mb L: 32/47 MS: 1 CrossOver- 00:08:47.205 [2024-04-26 20:04:31.405606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.205 [2024-04-26 20:04:31.405632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.205 [2024-04-26 20:04:31.405687] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.205 [2024-04-26 20:04:31.405704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.205 #62 NEW cov: 11968 ft: 15381 corp: 37/1021b lim: 50 exec/s: 62 rss: 72Mb L: 23/47 MS: 1 ChangeByte- 00:08:47.205 [2024-04-26 20:04:31.445707] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.205 [2024-04-26 20:04:31.445734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.205 [2024-04-26 20:04:31.445787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.205 [2024-04-26 20:04:31.445803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.205 #63 NEW cov: 11968 ft: 15438 corp: 38/1048b lim: 50 exec/s: 63 rss: 72Mb L: 27/47 MS: 1 ChangeByte- 00:08:47.205 [2024-04-26 20:04:31.485755] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.205 [2024-04-26 20:04:31.485780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.205 [2024-04-26 20:04:31.485832] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.205 [2024-04-26 20:04:31.485848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.206 #64 NEW cov: 11968 ft: 15442 corp: 39/1072b lim: 50 exec/s: 64 rss: 72Mb L: 24/47 MS: 1 ShuffleBytes- 00:08:47.206 [2024-04-26 20:04:31.526183] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.206 [2024-04-26 20:04:31.526210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.206 [2024-04-26 20:04:31.526260] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.206 [2024-04-26 20:04:31.526276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.206 [2024-04-26 20:04:31.526331] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:47.206 [2024-04-26 20:04:31.526347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:47.206 [2024-04-26 20:04:31.526404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:47.206 [2024-04-26 20:04:31.526420] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:47.206 #65 NEW cov: 11968 ft: 15449 corp: 40/1118b lim: 50 exec/s: 65 rss: 72Mb L: 46/47 MS: 1 InsertRepeatedBytes- 00:08:47.206 [2024-04-26 20:04:31.576041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:47.206 [2024-04-26 20:04:31.576068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.206 [2024-04-26 20:04:31.576130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:47.206 [2024-04-26 20:04:31.576146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:47.206 #66 NEW cov: 11968 ft: 15451 corp: 41/1145b lim: 50 exec/s: 33 rss: 72Mb L: 27/47 MS: 1 ChangeBinInt- 00:08:47.206 #66 DONE cov: 11968 ft: 15451 corp: 41/1145b lim: 50 exec/s: 33 rss: 72Mb 00:08:47.206 ###### Recommended dictionary. ###### 00:08:47.206 "\377\377\377\377" # Uses: 2 00:08:47.206 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:47.206 ###### End of recommended dictionary. ###### 00:08:47.206 Done 66 runs in 2 second(s) 00:08:47.464 20:04:31 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_21.conf /var/tmp/suppress_nvmf_fuzz 00:08:47.464 20:04:31 -- ../common.sh@72 -- # (( i++ )) 00:08:47.464 20:04:31 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:47.464 20:04:31 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:47.464 20:04:31 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:47.464 20:04:31 -- nvmf/run.sh@24 -- # local timen=1 00:08:47.464 20:04:31 -- nvmf/run.sh@25 -- # local core=0x1 00:08:47.464 20:04:31 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:47.464 20:04:31 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:47.464 20:04:31 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:47.464 20:04:31 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:47.464 20:04:31 -- nvmf/run.sh@34 -- # printf %02d 22 00:08:47.464 20:04:31 -- nvmf/run.sh@34 -- # port=4422 00:08:47.464 20:04:31 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:47.464 20:04:31 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:47.464 20:04:31 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:47.464 20:04:31 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:47.464 20:04:31 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:47.464 20:04:31 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 00:08:47.464 [2024-04-26 20:04:31.784556] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
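Note: the nvmf/run.sh trace just above shows how each numbered pass of the NVMe-over-TCP fuzzer is wired up: a per-run port is derived from the run number, a per-run JSON config is produced by rewriting the listener port in fuzz_json.conf, LeakSanitizer suppressions for two known allocations are put in place via LSAN_OPTIONS, an empty corpus directory is created, and llvm_nvme_fuzz is launched; the fuzzer then brings the SPDK target up in-process, listens on the per-run port, and starts the libFuzzer loop seen below. A condensed sketch of that sequence, reconstructed from the trace only (paths are shortened, and redirecting the sed output and the two "leak:" lines into their files is inferred from the surrounding variable names rather than shown in the trace):

    trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422'
    sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' test/fuzz/llvm/nvmf/fuzz_json.conf > /tmp/fuzz_json_22.conf   # output redirect inferred
    echo leak:spdk_nvmf_qpair_disconnect  > /var/tmp/suppress_nvmf_fuzz    # redirection target inferred
    echo leak:nvmf_ctrlr_create          >> /var/tmp/suppress_nvmf_fuzz    # redirection target inferred
    export LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0   # run.sh keeps this local to the function
    mkdir -p ../corpus/llvm_nvmf_22
    ./llvm_nvme_fuzz -m 0x1 -s 512 -P ../output/llvm/ -F "$trid" -c /tmp/fuzz_json_22.conf -t 1 -D ../corpus/llvm_nvmf_22 -Z 22

On completion the per-run config and suppressions file are removed (the rm -rf at run.sh@54 above) before the next run number is prepared.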
00:08:47.464 [2024-04-26 20:04:31.784630] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1628757 ] 00:08:47.464 EAL: No free 2048 kB hugepages reported on node 1 00:08:47.723 [2024-04-26 20:04:31.983472] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:47.723 [2024-04-26 20:04:32.054842] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.723 [2024-04-26 20:04:32.114175] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:47.723 [2024-04-26 20:04:32.130375] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:47.723 INFO: Running with entropic power schedule (0xFF, 100). 00:08:47.723 INFO: Seed: 2741084614 00:08:47.981 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:47.981 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:47.981 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:47.981 INFO: A corpus is not provided, starting from an empty corpus 00:08:47.981 #2 INITED exec/s: 0 rss: 63Mb 00:08:47.981 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:47.981 This may also happen if the target rejected all inputs we tried so far 00:08:47.981 [2024-04-26 20:04:32.197504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:47.981 [2024-04-26 20:04:32.197548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:47.981 [2024-04-26 20:04:32.197635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:47.981 [2024-04-26 20:04:32.197659] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.240 NEW_FUNC[1/672]: 0x4a9490 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:48.240 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:48.240 #15 NEW cov: 11750 ft: 11751 corp: 2/47b lim: 85 exec/s: 0 rss: 69Mb L: 46/46 MS: 3 CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:48.240 [2024-04-26 20:04:32.548398] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.240 [2024-04-26 20:04:32.548449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.240 [2024-04-26 20:04:32.548544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.240 [2024-04-26 20:04:32.548569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.240 [2024-04-26 20:04:32.548671] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.240 [2024-04-26 20:04:32.548695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 
cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.240 #17 NEW cov: 11880 ft: 12684 corp: 3/109b lim: 85 exec/s: 0 rss: 69Mb L: 62/62 MS: 2 InsertRepeatedBytes-InsertRepeatedBytes- 00:08:48.240 [2024-04-26 20:04:32.597943] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.240 [2024-04-26 20:04:32.597973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.240 #18 NEW cov: 11886 ft: 13635 corp: 4/127b lim: 85 exec/s: 0 rss: 69Mb L: 18/62 MS: 1 CrossOver- 00:08:48.240 [2024-04-26 20:04:32.658792] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.240 [2024-04-26 20:04:32.658820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.240 [2024-04-26 20:04:32.658888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.240 [2024-04-26 20:04:32.658906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.240 [2024-04-26 20:04:32.658965] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.240 [2024-04-26 20:04:32.658981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.498 #19 NEW cov: 11971 ft: 13988 corp: 5/189b lim: 85 exec/s: 0 rss: 69Mb L: 62/62 MS: 1 ShuffleBytes- 00:08:48.498 [2024-04-26 20:04:32.718998] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.498 [2024-04-26 20:04:32.719024] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.498 [2024-04-26 20:04:32.719081] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.498 [2024-04-26 20:04:32.719099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.498 [2024-04-26 20:04:32.719168] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.499 [2024-04-26 20:04:32.719187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.499 #25 NEW cov: 11971 ft: 14055 corp: 6/251b lim: 85 exec/s: 0 rss: 69Mb L: 62/62 MS: 1 CopyPart- 00:08:48.499 [2024-04-26 20:04:32.779164] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.499 [2024-04-26 20:04:32.779198] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.779284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.499 [2024-04-26 20:04:32.779302] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.779407] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 
00:08:48.499 [2024-04-26 20:04:32.779425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.499 #26 NEW cov: 11971 ft: 14121 corp: 7/313b lim: 85 exec/s: 0 rss: 69Mb L: 62/62 MS: 1 ChangeBinInt- 00:08:48.499 [2024-04-26 20:04:32.829795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.499 [2024-04-26 20:04:32.829825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.829909] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.499 [2024-04-26 20:04:32.829929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.829997] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.499 [2024-04-26 20:04:32.830016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.830113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.499 [2024-04-26 20:04:32.830129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.499 #27 NEW cov: 11971 ft: 14489 corp: 8/383b lim: 85 exec/s: 0 rss: 69Mb L: 70/70 MS: 1 CMP- DE: "\377\377\377\377\377\377\377\377"- 00:08:48.499 [2024-04-26 20:04:32.879899] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.499 [2024-04-26 20:04:32.879927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.880015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.499 [2024-04-26 20:04:32.880034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.880099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.499 [2024-04-26 20:04:32.880118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.880208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.499 [2024-04-26 20:04:32.880225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.499 #28 NEW cov: 11971 ft: 14576 corp: 9/453b lim: 85 exec/s: 0 rss: 70Mb L: 70/70 MS: 1 ChangeBinInt- 00:08:48.499 [2024-04-26 20:04:32.939866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.499 [2024-04-26 20:04:32.939898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.939954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.499 [2024-04-26 20:04:32.939974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.499 [2024-04-26 20:04:32.940037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.499 [2024-04-26 20:04:32.940056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.758 #29 NEW cov: 11971 ft: 14639 corp: 10/507b lim: 85 exec/s: 0 rss: 70Mb L: 54/70 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:48.758 [2024-04-26 20:04:32.989648] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.758 [2024-04-26 20:04:32.989675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.758 [2024-04-26 20:04:32.989733] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.758 [2024-04-26 20:04:32.989750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.758 #30 NEW cov: 11971 ft: 14697 corp: 11/553b lim: 85 exec/s: 0 rss: 70Mb L: 46/70 MS: 1 ChangeByte- 00:08:48.758 [2024-04-26 20:04:33.040493] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.758 [2024-04-26 20:04:33.040522] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.758 [2024-04-26 20:04:33.040611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:48.758 [2024-04-26 20:04:33.040630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:48.758 [2024-04-26 20:04:33.040686] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:48.758 [2024-04-26 20:04:33.040705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:48.758 [2024-04-26 20:04:33.040795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:48.758 [2024-04-26 20:04:33.040814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:48.758 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:48.758 #31 NEW cov: 11994 ft: 14751 corp: 12/636b lim: 85 exec/s: 0 rss: 70Mb L: 83/83 MS: 1 CopyPart- 00:08:48.758 [2024-04-26 20:04:33.099758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.758 [2024-04-26 20:04:33.099790] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.758 #32 NEW cov: 11994 ft: 14775 corp: 13/654b lim: 85 exec/s: 0 rss: 70Mb L: 18/83 MS: 1 CopyPart- 00:08:48.758 [2024-04-26 20:04:33.159992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:48.758 
[2024-04-26 20:04:33.160023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:48.758 #33 NEW cov: 11994 ft: 14817 corp: 14/672b lim: 85 exec/s: 33 rss: 70Mb L: 18/83 MS: 1 ChangeByte- 00:08:49.016 [2024-04-26 20:04:33.220224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-04-26 20:04:33.220254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 #34 NEW cov: 11994 ft: 14839 corp: 15/690b lim: 85 exec/s: 34 rss: 70Mb L: 18/83 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:49.016 [2024-04-26 20:04:33.280363] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-04-26 20:04:33.280393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 #35 NEW cov: 11994 ft: 14867 corp: 16/708b lim: 85 exec/s: 35 rss: 70Mb L: 18/83 MS: 1 CrossOver- 00:08:49.016 [2024-04-26 20:04:33.341820] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-04-26 20:04:33.341848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 [2024-04-26 20:04:33.341937] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.016 [2024-04-26 20:04:33.341957] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.016 [2024-04-26 20:04:33.342027] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.016 [2024-04-26 20:04:33.342042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.016 [2024-04-26 20:04:33.342134] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.016 [2024-04-26 20:04:33.342152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.016 [2024-04-26 20:04:33.342239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:4 nsid:0 00:08:49.016 [2024-04-26 20:04:33.342258] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:49.016 #36 NEW cov: 11994 ft: 14920 corp: 17/793b lim: 85 exec/s: 36 rss: 70Mb L: 85/85 MS: 1 CrossOver- 00:08:49.016 [2024-04-26 20:04:33.401463] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-04-26 20:04:33.401491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.016 [2024-04-26 20:04:33.401552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.016 [2024-04-26 20:04:33.401568] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
00:08:49.016 [2024-04-26 20:04:33.401638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.016 [2024-04-26 20:04:33.401654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.016 #37 NEW cov: 11994 ft: 14960 corp: 18/855b lim: 85 exec/s: 37 rss: 70Mb L: 62/85 MS: 1 ChangeBit- 00:08:49.016 [2024-04-26 20:04:33.450911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.016 [2024-04-26 20:04:33.450939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.274 #38 NEW cov: 11994 ft: 14967 corp: 19/873b lim: 85 exec/s: 38 rss: 70Mb L: 18/85 MS: 1 ShuffleBytes- 00:08:49.274 [2024-04-26 20:04:33.511750] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.274 [2024-04-26 20:04:33.511778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.511846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.274 [2024-04-26 20:04:33.511866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.511936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.274 [2024-04-26 20:04:33.511953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.274 #39 NEW cov: 11994 ft: 14989 corp: 20/927b lim: 85 exec/s: 39 rss: 70Mb L: 54/85 MS: 1 ChangeBit- 00:08:49.274 [2024-04-26 20:04:33.561886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.274 [2024-04-26 20:04:33.561916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.561993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.274 [2024-04-26 20:04:33.562013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.562080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.274 [2024-04-26 20:04:33.562097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.274 #40 NEW cov: 11994 ft: 15021 corp: 21/989b lim: 85 exec/s: 40 rss: 70Mb L: 62/85 MS: 1 ChangeByte- 00:08:49.274 [2024-04-26 20:04:33.612437] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.274 [2024-04-26 20:04:33.612465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.612543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.274 [2024-04-26 20:04:33.612563] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.612627] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.274 [2024-04-26 20:04:33.612644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.612731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.274 [2024-04-26 20:04:33.612751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.274 #41 NEW cov: 11994 ft: 15040 corp: 22/1072b lim: 85 exec/s: 41 rss: 71Mb L: 83/85 MS: 1 ChangeBit- 00:08:49.274 [2024-04-26 20:04:33.672266] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.274 [2024-04-26 20:04:33.672294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.672385] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.274 [2024-04-26 20:04:33.672404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.274 [2024-04-26 20:04:33.672458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.274 [2024-04-26 20:04:33.672476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.274 #42 NEW cov: 11994 ft: 15073 corp: 23/1137b lim: 85 exec/s: 42 rss: 71Mb L: 65/85 MS: 1 CrossOver- 00:08:49.532 [2024-04-26 20:04:33.721851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.532 [2024-04-26 20:04:33.721888] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.532 #45 NEW cov: 11994 ft: 15096 corp: 24/1158b lim: 85 exec/s: 45 rss: 71Mb L: 21/85 MS: 3 ChangeBit-InsertRepeatedBytes-PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:49.532 [2024-04-26 20:04:33.772741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.532 [2024-04-26 20:04:33.772772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.532 [2024-04-26 20:04:33.772877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.532 [2024-04-26 20:04:33.772899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.532 [2024-04-26 20:04:33.772999] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.532 [2024-04-26 20:04:33.773018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.532 #46 NEW cov: 11994 ft: 15107 corp: 25/1220b lim: 85 exec/s: 46 rss: 71Mb L: 62/85 MS: 1 ChangeBit- 00:08:49.532 
[2024-04-26 20:04:33.832152] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.533 [2024-04-26 20:04:33.832179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.533 #47 NEW cov: 11994 ft: 15119 corp: 26/1238b lim: 85 exec/s: 47 rss: 71Mb L: 18/85 MS: 1 CopyPart- 00:08:49.533 [2024-04-26 20:04:33.883003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.533 [2024-04-26 20:04:33.883030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.533 [2024-04-26 20:04:33.883090] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.533 [2024-04-26 20:04:33.883108] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.533 [2024-04-26 20:04:33.883165] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.533 [2024-04-26 20:04:33.883182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.533 #48 NEW cov: 11994 ft: 15136 corp: 27/1301b lim: 85 exec/s: 48 rss: 71Mb L: 63/85 MS: 1 InsertByte- 00:08:49.533 [2024-04-26 20:04:33.942867] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.533 [2024-04-26 20:04:33.942899] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.533 [2024-04-26 20:04:33.942985] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.533 [2024-04-26 20:04:33.943001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.533 #49 NEW cov: 11994 ft: 15161 corp: 28/1337b lim: 85 exec/s: 49 rss: 71Mb L: 36/85 MS: 1 CopyPart- 00:08:49.792 [2024-04-26 20:04:33.993346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.792 [2024-04-26 20:04:33.993374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:33.993451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.792 [2024-04-26 20:04:33.993468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:33.993524] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.792 [2024-04-26 20:04:33.993538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.792 #50 NEW cov: 11994 ft: 15172 corp: 29/1400b lim: 85 exec/s: 50 rss: 71Mb L: 63/85 MS: 1 ChangeByte- 00:08:49.792 [2024-04-26 20:04:34.053882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.792 [2024-04-26 20:04:34.053908] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:34.053978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.792 [2024-04-26 20:04:34.054001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:34.054053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.792 [2024-04-26 20:04:34.054069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:34.054151] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.792 [2024-04-26 20:04:34.054170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.792 #51 NEW cov: 11994 ft: 15248 corp: 30/1470b lim: 85 exec/s: 51 rss: 72Mb L: 70/85 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:08:49.792 [2024-04-26 20:04:34.113815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.792 [2024-04-26 20:04:34.113842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:34.113921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.792 [2024-04-26 20:04:34.113941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:34.114009] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.792 [2024-04-26 20:04:34.114027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.792 #52 NEW cov: 11994 ft: 15253 corp: 31/1533b lim: 85 exec/s: 52 rss: 72Mb L: 63/85 MS: 1 InsertByte- 00:08:49.792 [2024-04-26 20:04:34.164323] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:49.792 [2024-04-26 20:04:34.164352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:34.164421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:49.792 [2024-04-26 20:04:34.164440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:34.164505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:49.792 [2024-04-26 20:04:34.164526] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:49.792 [2024-04-26 20:04:34.164613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:49.792 [2024-04-26 20:04:34.164633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:49.792 #53 NEW cov: 11994 ft: 15262 corp: 32/1615b lim: 85 exec/s: 26 rss: 72Mb L: 82/85 MS: 1 CopyPart- 00:08:49.792 #53 DONE cov: 11994 ft: 15262 corp: 32/1615b lim: 85 exec/s: 26 rss: 72Mb 00:08:49.792 ###### Recommended dictionary. ###### 00:08:49.792 "\377\377\377\377\377\377\377\377" # Uses: 4 00:08:49.792 ###### End of recommended dictionary. ###### 00:08:49.792 Done 53 runs in 2 second(s) 00:08:50.051 20:04:34 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_22.conf /var/tmp/suppress_nvmf_fuzz 00:08:50.051 20:04:34 -- ../common.sh@72 -- # (( i++ )) 00:08:50.051 20:04:34 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:50.051 20:04:34 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:50.051 20:04:34 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:50.051 20:04:34 -- nvmf/run.sh@24 -- # local timen=1 00:08:50.051 20:04:34 -- nvmf/run.sh@25 -- # local core=0x1 00:08:50.051 20:04:34 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:50.051 20:04:34 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:50.051 20:04:34 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:50.051 20:04:34 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:50.051 20:04:34 -- nvmf/run.sh@34 -- # printf %02d 23 00:08:50.051 20:04:34 -- nvmf/run.sh@34 -- # port=4423 00:08:50.051 20:04:34 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:50.051 20:04:34 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:50.051 20:04:34 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:50.051 20:04:34 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:50.051 20:04:34 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:50.051 20:04:34 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 00:08:50.051 [2024-04-26 20:04:34.357695] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:50.051 [2024-04-26 20:04:34.357749] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1629114 ] 00:08:50.051 EAL: No free 2048 kB hugepages reported on node 1 00:08:50.310 [2024-04-26 20:04:34.554639] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:50.310 [2024-04-26 20:04:34.625526] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.310 [2024-04-26 20:04:34.684828] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:50.310 [2024-04-26 20:04:34.701016] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:50.310 INFO: Running with entropic power schedule (0xFF, 100). 
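Note: each run finishes with a "Recommended dictionary." block (see the ends of runs 21 and 22 above) listing byte sequences libFuzzer found productive, printed with octal escapes, together with how often each was used. If a later run wanted to seed those tokens explicitly, they could be written into a standard libFuzzer dictionary file, which uses hex escapes instead; a small sketch follows (the nvmf.dict name is made up here, and this log does not show whether the SPDK wrapper forwards libFuzzer's -dict= option):

    cat > nvmf.dict <<'EOF'
    # tokens reported above: octal \377 -> hex \xff, \001 -> \x01, \000 -> \x00
    "\xff\xff\xff\xff"
    "\xff\xff\xff\xff\xff\xff\xff\xff"
    "\x01\x00\x00\x00\x00\x00\x00\x00"
    EOF
    # a plain libFuzzer target would take it as:  ./fuzzer -dict=nvmf.dict corpus_dir/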
00:08:50.310 INFO: Seed: 1016087385 00:08:50.310 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:50.310 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:50.310 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:50.310 INFO: A corpus is not provided, starting from an empty corpus 00:08:50.310 #2 INITED exec/s: 0 rss: 62Mb 00:08:50.310 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:50.310 This may also happen if the target rejected all inputs we tried so far 00:08:50.310 [2024-04-26 20:04:34.750533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.310 [2024-04-26 20:04:34.750564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.310 [2024-04-26 20:04:34.750608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.310 [2024-04-26 20:04:34.750624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.310 [2024-04-26 20:04:34.750676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.310 [2024-04-26 20:04:34.750691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.310 [2024-04-26 20:04:34.750744] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:50.310 [2024-04-26 20:04:34.750759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.310 [2024-04-26 20:04:34.750813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:50.310 [2024-04-26 20:04:34.750828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:50.827 NEW_FUNC[1/671]: 0x4ac6c0 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:50.827 NEW_FUNC[2/671]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:50.827 #18 NEW cov: 11681 ft: 11684 corp: 2/26b lim: 25 exec/s: 0 rss: 69Mb L: 25/25 MS: 1 InsertRepeatedBytes- 00:08:50.827 [2024-04-26 20:04:35.070939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.828 [2024-04-26 20:04:35.070974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.828 #23 NEW cov: 11813 ft: 12858 corp: 3/32b lim: 25 exec/s: 0 rss: 69Mb L: 6/25 MS: 5 ChangeBit-InsertByte-CrossOver-ShuffleBytes-CopyPart- 00:08:50.828 [2024-04-26 20:04:35.110911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.828 [2024-04-26 20:04:35.110939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.828 #24 NEW cov: 11819 ft: 13136 corp: 4/38b lim: 25 exec/s: 0 rss: 
70Mb L: 6/25 MS: 1 ChangeByte- 00:08:50.828 [2024-04-26 20:04:35.151039] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.828 [2024-04-26 20:04:35.151066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.828 #25 NEW cov: 11904 ft: 13449 corp: 5/44b lim: 25 exec/s: 0 rss: 70Mb L: 6/25 MS: 1 ChangeBit- 00:08:50.828 [2024-04-26 20:04:35.191638] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.828 [2024-04-26 20:04:35.191666] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.828 [2024-04-26 20:04:35.191720] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:50.828 [2024-04-26 20:04:35.191735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:50.828 [2024-04-26 20:04:35.191790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:50.828 [2024-04-26 20:04:35.191806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:50.828 [2024-04-26 20:04:35.191859] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:50.828 [2024-04-26 20:04:35.191878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:50.828 [2024-04-26 20:04:35.191935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:50.828 [2024-04-26 20:04:35.191950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:50.828 #26 NEW cov: 11904 ft: 13576 corp: 6/69b lim: 25 exec/s: 0 rss: 70Mb L: 25/25 MS: 1 ChangeBit- 00:08:50.828 [2024-04-26 20:04:35.241314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:50.828 [2024-04-26 20:04:35.241341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:50.828 #27 NEW cov: 11904 ft: 13654 corp: 7/74b lim: 25 exec/s: 0 rss: 70Mb L: 5/25 MS: 1 EraseBytes- 00:08:51.087 [2024-04-26 20:04:35.291803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.087 [2024-04-26 20:04:35.291831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.291886] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.087 [2024-04-26 20:04:35.291917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.291974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.087 [2024-04-26 20:04:35.291990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:51.087 [2024-04-26 20:04:35.292044] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.087 [2024-04-26 20:04:35.292060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.087 #30 NEW cov: 11904 ft: 13737 corp: 8/96b lim: 25 exec/s: 0 rss: 70Mb L: 22/25 MS: 3 ChangeBit-ChangeByte-InsertRepeatedBytes- 00:08:51.087 [2024-04-26 20:04:35.332097] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.087 [2024-04-26 20:04:35.332124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.332197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.087 [2024-04-26 20:04:35.332213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.332269] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.087 [2024-04-26 20:04:35.332285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.332340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.087 [2024-04-26 20:04:35.332355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.332409] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.087 [2024-04-26 20:04:35.332425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.087 #31 NEW cov: 11904 ft: 13842 corp: 9/121b lim: 25 exec/s: 0 rss: 70Mb L: 25/25 MS: 1 ChangeBit- 00:08:51.087 [2024-04-26 20:04:35.371948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.087 [2024-04-26 20:04:35.371974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.372037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.087 [2024-04-26 20:04:35.372053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.372109] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.087 [2024-04-26 20:04:35.372124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.087 #37 NEW cov: 11904 ft: 14121 corp: 10/139b lim: 25 exec/s: 0 rss: 70Mb L: 18/25 MS: 1 EraseBytes- 00:08:51.087 [2024-04-26 20:04:35.422538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.087 [2024-04-26 20:04:35.422564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:51.087 [2024-04-26 20:04:35.422635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.087 [2024-04-26 20:04:35.422650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.422704] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.087 [2024-04-26 20:04:35.422720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.422776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.087 [2024-04-26 20:04:35.422791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.087 [2024-04-26 20:04:35.422846] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.087 [2024-04-26 20:04:35.422862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.087 #38 NEW cov: 11913 ft: 14276 corp: 11/164b lim: 25 exec/s: 0 rss: 70Mb L: 25/25 MS: 1 ChangeByte- 00:08:51.087 [2024-04-26 20:04:35.461957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.087 [2024-04-26 20:04:35.461982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.087 #39 NEW cov: 11913 ft: 14350 corp: 12/170b lim: 25 exec/s: 0 rss: 70Mb L: 6/25 MS: 1 ShuffleBytes- 00:08:51.087 [2024-04-26 20:04:35.502043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.087 [2024-04-26 20:04:35.502069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.087 #40 NEW cov: 11913 ft: 14406 corp: 13/177b lim: 25 exec/s: 0 rss: 70Mb L: 7/25 MS: 1 InsertByte- 00:08:51.346 [2024-04-26 20:04:35.542616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.346 [2024-04-26 20:04:35.542643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.542718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.346 [2024-04-26 20:04:35.542733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.542786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.346 [2024-04-26 20:04:35.542802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.542855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.346 [2024-04-26 20:04:35.542870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.346 
[2024-04-26 20:04:35.542932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.346 [2024-04-26 20:04:35.542960] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.346 #41 NEW cov: 11913 ft: 14437 corp: 14/202b lim: 25 exec/s: 0 rss: 70Mb L: 25/25 MS: 1 ChangeBit- 00:08:51.346 [2024-04-26 20:04:35.582252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.346 [2024-04-26 20:04:35.582279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.346 #42 NEW cov: 11913 ft: 14452 corp: 15/209b lim: 25 exec/s: 0 rss: 70Mb L: 7/25 MS: 1 InsertByte- 00:08:51.346 [2024-04-26 20:04:35.632642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.346 [2024-04-26 20:04:35.632668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.632731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.346 [2024-04-26 20:04:35.632747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.632802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.346 [2024-04-26 20:04:35.632817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.346 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:51.346 #48 NEW cov: 11936 ft: 14524 corp: 16/226b lim: 25 exec/s: 0 rss: 71Mb L: 17/25 MS: 1 EraseBytes- 00:08:51.346 [2024-04-26 20:04:35.683043] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.346 [2024-04-26 20:04:35.683070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.683126] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.346 [2024-04-26 20:04:35.683141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.683194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.346 [2024-04-26 20:04:35.683208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.683262] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.346 [2024-04-26 20:04:35.683277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.346 [2024-04-26 20:04:35.683333] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.346 [2024-04-26 20:04:35.683348] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.346 #49 NEW cov: 11936 ft: 14578 corp: 17/251b lim: 25 exec/s: 0 rss: 71Mb L: 25/25 MS: 1 CrossOver- 00:08:51.346 [2024-04-26 20:04:35.722643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.346 [2024-04-26 20:04:35.722668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.346 #50 NEW cov: 11936 ft: 14586 corp: 18/257b lim: 25 exec/s: 50 rss: 71Mb L: 6/25 MS: 1 CrossOver- 00:08:51.346 [2024-04-26 20:04:35.762818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.346 [2024-04-26 20:04:35.762844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.605 #51 NEW cov: 11936 ft: 14637 corp: 19/263b lim: 25 exec/s: 51 rss: 71Mb L: 6/25 MS: 1 ChangeBinInt- 00:08:51.605 [2024-04-26 20:04:35.803038] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.605 [2024-04-26 20:04:35.803064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.605 [2024-04-26 20:04:35.803127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.605 [2024-04-26 20:04:35.803142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.605 #52 NEW cov: 11936 ft: 14836 corp: 20/276b lim: 25 exec/s: 52 rss: 71Mb L: 13/25 MS: 1 EraseBytes- 00:08:51.605 [2024-04-26 20:04:35.843316] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.605 [2024-04-26 20:04:35.843343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.605 [2024-04-26 20:04:35.843392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.605 [2024-04-26 20:04:35.843408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.605 [2024-04-26 20:04:35.843482] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.605 [2024-04-26 20:04:35.843498] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.605 [2024-04-26 20:04:35.843553] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.605 [2024-04-26 20:04:35.843567] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.605 #53 NEW cov: 11936 ft: 14854 corp: 21/299b lim: 25 exec/s: 53 rss: 71Mb L: 23/25 MS: 1 InsertByte- 00:08:51.605 [2024-04-26 20:04:35.883224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.605 [2024-04-26 20:04:35.883250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:51.605 [2024-04-26 20:04:35.883287] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.605 [2024-04-26 20:04:35.883303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.605 #54 NEW cov: 11936 ft: 14866 corp: 22/310b lim: 25 exec/s: 54 rss: 72Mb L: 11/25 MS: 1 CrossOver- 00:08:51.605 [2024-04-26 20:04:35.923268] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.605 [2024-04-26 20:04:35.923294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.605 #55 NEW cov: 11936 ft: 14939 corp: 23/315b lim: 25 exec/s: 55 rss: 72Mb L: 5/25 MS: 1 EraseBytes- 00:08:51.605 [2024-04-26 20:04:35.963826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.605 [2024-04-26 20:04:35.963852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.605 [2024-04-26 20:04:35.963907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.605 [2024-04-26 20:04:35.963922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.605 [2024-04-26 20:04:35.963978] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.605 [2024-04-26 20:04:35.963993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.605 [2024-04-26 20:04:35.964048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.605 [2024-04-26 20:04:35.964063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.605 [2024-04-26 20:04:35.964118] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.605 [2024-04-26 20:04:35.964134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.605 #56 NEW cov: 11936 ft: 14969 corp: 24/340b lim: 25 exec/s: 56 rss: 72Mb L: 25/25 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\000"- 00:08:51.605 [2024-04-26 20:04:36.003591] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.606 [2024-04-26 20:04:36.003618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.606 [2024-04-26 20:04:36.003655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.606 [2024-04-26 20:04:36.003670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.606 #57 NEW cov: 11936 ft: 14976 corp: 25/351b lim: 25 exec/s: 57 rss: 72Mb L: 11/25 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:51.606 [2024-04-26 20:04:36.044092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 
00:08:51.606 [2024-04-26 20:04:36.044118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.606 [2024-04-26 20:04:36.044176] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.606 [2024-04-26 20:04:36.044191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.606 [2024-04-26 20:04:36.044248] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.606 [2024-04-26 20:04:36.044263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.606 [2024-04-26 20:04:36.044317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.606 [2024-04-26 20:04:36.044332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.606 [2024-04-26 20:04:36.044389] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.606 [2024-04-26 20:04:36.044405] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.865 #58 NEW cov: 11936 ft: 15011 corp: 26/376b lim: 25 exec/s: 58 rss: 72Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:51.865 [2024-04-26 20:04:36.084200] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.865 [2024-04-26 20:04:36.084226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.865 [2024-04-26 20:04:36.084283] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.865 [2024-04-26 20:04:36.084298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.865 [2024-04-26 20:04:36.084352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.865 [2024-04-26 20:04:36.084366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.865 [2024-04-26 20:04:36.084420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:51.865 [2024-04-26 20:04:36.084436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:51.865 [2024-04-26 20:04:36.084491] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:51.865 [2024-04-26 20:04:36.084506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:51.865 #59 NEW cov: 11936 ft: 15025 corp: 27/401b lim: 25 exec/s: 59 rss: 72Mb L: 25/25 MS: 1 ChangeBinInt- 00:08:51.865 [2024-04-26 20:04:36.123831] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.865 [2024-04-26 20:04:36.123857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.865 #60 NEW cov: 11936 ft: 15050 corp: 28/409b lim: 25 exec/s: 60 rss: 72Mb L: 8/25 MS: 1 InsertByte- 00:08:51.865 [2024-04-26 20:04:36.163939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.865 [2024-04-26 20:04:36.163968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.865 #61 NEW cov: 11936 ft: 15060 corp: 29/414b lim: 25 exec/s: 61 rss: 72Mb L: 5/25 MS: 1 ShuffleBytes- 00:08:51.865 [2024-04-26 20:04:36.204311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.865 [2024-04-26 20:04:36.204338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.865 [2024-04-26 20:04:36.204392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.865 [2024-04-26 20:04:36.204408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:51.865 [2024-04-26 20:04:36.204465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:51.865 [2024-04-26 20:04:36.204480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:51.865 #62 NEW cov: 11936 ft: 15103 corp: 30/429b lim: 25 exec/s: 62 rss: 72Mb L: 15/25 MS: 1 CrossOver- 00:08:51.865 [2024-04-26 20:04:36.254210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.865 [2024-04-26 20:04:36.254238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.865 #63 NEW cov: 11936 ft: 15116 corp: 31/434b lim: 25 exec/s: 63 rss: 72Mb L: 5/25 MS: 1 ChangeByte- 00:08:51.865 [2024-04-26 20:04:36.294434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:51.865 [2024-04-26 20:04:36.294461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:51.865 [2024-04-26 20:04:36.294499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:51.865 [2024-04-26 20:04:36.294514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.124 #64 NEW cov: 11936 ft: 15139 corp: 32/445b lim: 25 exec/s: 64 rss: 72Mb L: 11/25 MS: 1 CrossOver- 00:08:52.124 [2024-04-26 20:04:36.344700] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.124 [2024-04-26 20:04:36.344727] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.124 [2024-04-26 20:04:36.344771] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.124 [2024-04-26 20:04:36.344786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.124 [2024-04-26 20:04:36.344843] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.124 [2024-04-26 20:04:36.344859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.124 #65 NEW cov: 11936 ft: 15144 corp: 33/463b lim: 25 exec/s: 65 rss: 72Mb L: 18/25 MS: 1 InsertByte- 00:08:52.124 [2024-04-26 20:04:36.384825] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.124 [2024-04-26 20:04:36.384852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.124 [2024-04-26 20:04:36.384906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.124 [2024-04-26 20:04:36.384923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.124 [2024-04-26 20:04:36.384995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.124 [2024-04-26 20:04:36.385011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.124 #66 NEW cov: 11936 ft: 15152 corp: 34/478b lim: 25 exec/s: 66 rss: 72Mb L: 15/25 MS: 1 InsertRepeatedBytes- 00:08:52.124 [2024-04-26 20:04:36.425188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.124 [2024-04-26 20:04:36.425218] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.124 [2024-04-26 20:04:36.425270] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.125 [2024-04-26 20:04:36.425285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.125 [2024-04-26 20:04:36.425340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.125 [2024-04-26 20:04:36.425354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.125 [2024-04-26 20:04:36.425406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.125 [2024-04-26 20:04:36.425421] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.125 [2024-04-26 20:04:36.425478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:4 nsid:0 00:08:52.125 [2024-04-26 20:04:36.425494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:52.125 #67 NEW cov: 11936 ft: 15160 corp: 35/503b lim: 25 exec/s: 67 rss: 72Mb L: 25/25 MS: 1 ShuffleBytes- 00:08:52.125 [2024-04-26 20:04:36.464983] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.125 [2024-04-26 20:04:36.465008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.125 [2024-04-26 20:04:36.465044] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.125 [2024-04-26 20:04:36.465059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.125 #68 NEW cov: 11936 ft: 15167 corp: 36/516b lim: 25 exec/s: 68 rss: 73Mb L: 13/25 MS: 1 ShuffleBytes- 00:08:52.125 [2024-04-26 20:04:36.505092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.125 [2024-04-26 20:04:36.505118] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.125 [2024-04-26 20:04:36.505157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.125 [2024-04-26 20:04:36.505171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.125 #69 NEW cov: 11936 ft: 15171 corp: 37/530b lim: 25 exec/s: 69 rss: 73Mb L: 14/25 MS: 1 InsertByte- 00:08:52.125 [2024-04-26 20:04:36.545086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.125 [2024-04-26 20:04:36.545112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.125 #70 NEW cov: 11936 ft: 15183 corp: 38/537b lim: 25 exec/s: 70 rss: 73Mb L: 7/25 MS: 1 InsertByte- 00:08:52.384 [2024-04-26 20:04:36.585224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.384 [2024-04-26 20:04:36.585249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.384 #71 NEW cov: 11936 ft: 15198 corp: 39/545b lim: 25 exec/s: 71 rss: 73Mb L: 8/25 MS: 1 CopyPart- 00:08:52.384 [2024-04-26 20:04:36.625635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.384 [2024-04-26 20:04:36.625662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.384 [2024-04-26 20:04:36.625715] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.384 [2024-04-26 20:04:36.625733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.384 [2024-04-26 20:04:36.625787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.384 [2024-04-26 20:04:36.625801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.384 [2024-04-26 20:04:36.625856] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.384 [2024-04-26 20:04:36.625875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.384 #72 NEW cov: 11936 ft: 15218 corp: 40/567b lim: 25 exec/s: 72 rss: 73Mb L: 22/25 MS: 1 EraseBytes- 00:08:52.384 [2024-04-26 20:04:36.665441] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.384 [2024-04-26 
20:04:36.665467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.384 #73 NEW cov: 11936 ft: 15223 corp: 41/574b lim: 25 exec/s: 73 rss: 73Mb L: 7/25 MS: 1 CMP- DE: "\000\005"- 00:08:52.384 [2024-04-26 20:04:36.705885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.384 [2024-04-26 20:04:36.705910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.384 [2024-04-26 20:04:36.705968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.384 [2024-04-26 20:04:36.705982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.384 [2024-04-26 20:04:36.706035] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.384 [2024-04-26 20:04:36.706050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.384 [2024-04-26 20:04:36.706104] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:52.384 [2024-04-26 20:04:36.706119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:52.384 #74 NEW cov: 11936 ft: 15232 corp: 42/595b lim: 25 exec/s: 74 rss: 73Mb L: 21/25 MS: 1 CrossOver- 00:08:52.384 [2024-04-26 20:04:36.745842] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:52.384 [2024-04-26 20:04:36.745868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:52.384 [2024-04-26 20:04:36.745925] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:52.384 [2024-04-26 20:04:36.745940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:52.384 [2024-04-26 20:04:36.745995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:52.384 [2024-04-26 20:04:36.746009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:52.384 #75 NEW cov: 11936 ft: 15234 corp: 43/610b lim: 25 exec/s: 37 rss: 73Mb L: 15/25 MS: 1 ShuffleBytes- 00:08:52.384 #75 DONE cov: 11936 ft: 15234 corp: 43/610b lim: 25 exec/s: 37 rss: 73Mb 00:08:52.384 ###### Recommended dictionary. ###### 00:08:52.384 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:52.384 "\000\005" # Uses: 0 00:08:52.384 ###### End of recommended dictionary. 
###### 00:08:52.384 Done 75 runs in 2 second(s) 00:08:52.643 20:04:36 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_23.conf /var/tmp/suppress_nvmf_fuzz 00:08:52.643 20:04:36 -- ../common.sh@72 -- # (( i++ )) 00:08:52.643 20:04:36 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:52.643 20:04:36 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:52.643 20:04:36 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:52.643 20:04:36 -- nvmf/run.sh@24 -- # local timen=1 00:08:52.643 20:04:36 -- nvmf/run.sh@25 -- # local core=0x1 00:08:52.643 20:04:36 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:52.643 20:04:36 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:52.643 20:04:36 -- nvmf/run.sh@28 -- # local suppress_file=/var/tmp/suppress_nvmf_fuzz 00:08:52.643 20:04:36 -- nvmf/run.sh@32 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_nvmf_fuzz:print_suppressions=0 00:08:52.643 20:04:36 -- nvmf/run.sh@34 -- # printf %02d 24 00:08:52.643 20:04:36 -- nvmf/run.sh@34 -- # port=4424 00:08:52.643 20:04:36 -- nvmf/run.sh@35 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:52.643 20:04:36 -- nvmf/run.sh@37 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:52.643 20:04:36 -- nvmf/run.sh@38 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:52.643 20:04:36 -- nvmf/run.sh@41 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:52.643 20:04:36 -- nvmf/run.sh@42 -- # echo leak:nvmf_ctrlr_create 00:08:52.643 20:04:36 -- nvmf/run.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 00:08:52.643 [2024-04-26 20:04:36.947039] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:52.643 [2024-04-26 20:04:36.947096] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1629471 ] 00:08:52.643 EAL: No free 2048 kB hugepages reported on node 1 00:08:52.902 [2024-04-26 20:04:37.147578] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:52.902 [2024-04-26 20:04:37.218359] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:52.902 [2024-04-26 20:04:37.277505] tcp.c: 669:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:52.902 [2024-04-26 20:04:37.293696] tcp.c: 965:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:52.902 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:52.902 INFO: Seed: 3608093175 00:08:52.902 INFO: Loaded 1 modules (348540 inline 8-bit counters): 348540 [0x28b4dcc, 0x2909f48), 00:08:52.902 INFO: Loaded 1 PC tables (348540 PCs): 348540 [0x2909f48,0x2e5b708), 00:08:52.902 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:52.902 INFO: A corpus is not provided, starting from an empty corpus 00:08:52.902 #2 INITED exec/s: 0 rss: 62Mb 00:08:52.902 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:52.902 This may also happen if the target rejected all inputs we tried so far 00:08:52.902 [2024-04-26 20:04:37.342094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:52.902 [2024-04-26 20:04:37.342125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.418 NEW_FUNC[1/672]: 0x4ad7a0 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:53.418 NEW_FUNC[2/672]: 0x4be400 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:53.418 #5 NEW cov: 11755 ft: 11756 corp: 2/22b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 3 InsertByte-InsertByte-InsertRepeatedBytes- 00:08:53.418 [2024-04-26 20:04:37.662973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.418 [2024-04-26 20:04:37.663015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.418 #6 NEW cov: 11885 ft: 12241 corp: 3/43b lim: 100 exec/s: 0 rss: 69Mb L: 21/21 MS: 1 ShuffleBytes- 00:08:53.418 [2024-04-26 20:04:37.712997] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:13021231110853801140 len:46261 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.418 [2024-04-26 20:04:37.713026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.418 #11 NEW cov: 11891 ft: 12465 corp: 4/66b lim: 100 exec/s: 0 rss: 69Mb L: 23/23 MS: 5 InsertRepeatedBytes-CopyPart-ShuffleBytes-ShuffleBytes-CopyPart- 00:08:53.418 [2024-04-26 20:04:37.753080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.418 [2024-04-26 20:04:37.753107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.418 #12 NEW cov: 11976 ft: 12712 corp: 5/87b lim: 100 exec/s: 0 rss: 70Mb L: 21/23 MS: 1 ShuffleBytes- 00:08:53.418 [2024-04-26 20:04:37.793378] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070526223103 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.418 [2024-04-26 20:04:37.793404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.418 [2024-04-26 20:04:37.793443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:16963 SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:08:53.418 [2024-04-26 20:04:37.793459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.418 #13 NEW cov: 11976 ft: 13658 corp: 6/133b lim: 100 exec/s: 0 rss: 70Mb L: 46/46 MS: 1 InsertRepeatedBytes- 00:08:53.418 [2024-04-26 20:04:37.833325] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.418 [2024-04-26 20:04:37.833352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.418 #14 NEW cov: 11976 ft: 13765 corp: 7/154b lim: 100 exec/s: 0 rss: 70Mb L: 21/46 MS: 1 ChangeByte- 00:08:53.677 [2024-04-26 20:04:37.873460] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.677 [2024-04-26 20:04:37.873487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.677 #15 NEW cov: 11976 ft: 13834 corp: 8/175b lim: 100 exec/s: 0 rss: 70Mb L: 21/46 MS: 1 ShuffleBytes- 00:08:53.677 [2024-04-26 20:04:37.913548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:37.913576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.678 #16 NEW cov: 11976 ft: 13890 corp: 9/196b lim: 100 exec/s: 0 rss: 70Mb L: 21/46 MS: 1 ChangeBinInt- 00:08:53.678 [2024-04-26 20:04:37.953666] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:37.953693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.678 #17 NEW cov: 11976 ft: 13914 corp: 10/217b lim: 100 exec/s: 0 rss: 70Mb L: 21/46 MS: 1 ShuffleBytes- 00:08:53.678 [2024-04-26 20:04:37.994414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:37.994440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.678 [2024-04-26 20:04:37.994498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6582955727375784795 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:37.994519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.678 [2024-04-26 20:04:37.994572] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:37.994587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:53.678 [2024-04-26 20:04:37.994640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6582955728264977243 len:23388 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:37.994655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:53.678 [2024-04-26 20:04:37.994709] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:37.994723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:53.678 #18 NEW cov: 11976 ft: 14444 corp: 11/317b lim: 100 exec/s: 0 rss: 70Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:53.678 [2024-04-26 20:04:38.043962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451626356392514 len:67 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:38.043989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.678 #19 NEW cov: 11976 ft: 14465 corp: 12/338b lim: 100 exec/s: 0 rss: 70Mb L: 21/100 MS: 1 CMP- DE: "u\000\000\000"- 00:08:53.678 [2024-04-26 20:04:38.084072] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:17007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.678 [2024-04-26 20:04:38.084100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.678 #20 NEW cov: 11976 ft: 14511 corp: 13/359b lim: 100 exec/s: 0 rss: 70Mb L: 21/100 MS: 1 ChangeBit- 00:08:53.936 [2024-04-26 20:04:38.124214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:17007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.936 [2024-04-26 20:04:38.124241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.936 #21 NEW cov: 11976 ft: 14532 corp: 14/380b lim: 100 exec/s: 0 rss: 70Mb L: 21/100 MS: 1 CrossOver- 00:08:53.936 [2024-04-26 20:04:38.164305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4811741872450060866 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.936 [2024-04-26 20:04:38.164332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.936 #22 NEW cov: 11976 ft: 14552 corp: 15/401b lim: 100 exec/s: 0 rss: 70Mb L: 21/100 MS: 1 ChangeBinInt- 00:08:53.936 [2024-04-26 20:04:38.204446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451626356392514 len:67 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.936 [2024-04-26 20:04:38.204472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.936 NEW_FUNC[1/1]: 0x19c3f70 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:53.936 #23 NEW cov: 11999 ft: 14607 corp: 16/422b lim: 100 exec/s: 0 rss: 71Mb L: 21/100 MS: 1 ChangeBit- 00:08:53.936 [2024-04-26 20:04:38.244561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.936 [2024-04-26 20:04:38.244588] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.936 #24 NEW cov: 11999 ft: 14702 corp: 17/442b lim: 100 exec/s: 0 rss: 71Mb L: 20/100 MS: 1 EraseBytes- 00:08:53.936 [2024-04-26 20:04:38.284677] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070526223103 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.936 [2024-04-26 20:04:38.284705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.936 [2024-04-26 20:04:38.284743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.936 [2024-04-26 20:04:38.284758] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:53.936 #25 NEW cov: 11999 ft: 14790 corp: 18/488b lim: 100 exec/s: 0 rss: 71Mb L: 46/100 MS: 1 CopyPart- 00:08:53.936 [2024-04-26 20:04:38.334807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451626356392514 len:67 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.936 [2024-04-26 20:04:38.334834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:53.936 #26 NEW cov: 11999 ft: 14808 corp: 19/510b lim: 100 exec/s: 26 rss: 71Mb L: 22/100 MS: 1 InsertByte- 00:08:53.936 [2024-04-26 20:04:38.374949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4811741872450060866 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:53.936 [2024-04-26 20:04:38.374976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.216 #27 NEW cov: 11999 ft: 14823 corp: 20/532b lim: 100 exec/s: 27 rss: 71Mb L: 22/100 MS: 1 InsertByte- 00:08:54.216 [2024-04-26 20:04:38.415026] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:284579480181 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.415053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.216 #28 NEW cov: 11999 ft: 14892 corp: 21/556b lim: 100 exec/s: 28 rss: 71Mb L: 24/100 MS: 1 PersAutoDict- DE: "u\000\000\000"- 00:08:54.216 [2024-04-26 20:04:38.455775] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.455804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.455879] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6582955727375784795 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.455896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.455949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 
20:04:38.455966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.456020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.456036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.456091] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.456105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.216 #29 NEW cov: 11999 ft: 14931 corp: 22/656b lim: 100 exec/s: 29 rss: 72Mb L: 100/100 MS: 1 ChangeBinInt- 00:08:54.216 [2024-04-26 20:04:38.505609] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16988 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.505636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.505673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.505689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.505745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.505761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.216 #30 NEW cov: 11999 ft: 15224 corp: 23/735b lim: 100 exec/s: 30 rss: 72Mb L: 79/100 MS: 1 EraseBytes- 00:08:54.216 [2024-04-26 20:04:38.546017] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.546043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.546101] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6582955727375784795 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.546116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.546171] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.546186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.546242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6582955728264977243 len:23388 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.546256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.216 [2024-04-26 20:04:38.546308] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.546324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.216 #31 NEW cov: 11999 ft: 15257 corp: 24/835b lim: 100 exec/s: 31 rss: 72Mb L: 100/100 MS: 1 CopyPart- 00:08:54.216 [2024-04-26 20:04:38.595593] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4811741344169083458 len:48574 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.216 [2024-04-26 20:04:38.595620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.217 #32 NEW cov: 11999 ft: 15265 corp: 25/857b lim: 100 exec/s: 32 rss: 72Mb L: 22/100 MS: 1 ShuffleBytes- 00:08:54.217 [2024-04-26 20:04:38.635877] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070526223103 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.217 [2024-04-26 20:04:38.635904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.217 [2024-04-26 20:04:38.635944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.217 [2024-04-26 20:04:38.635963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.528 #33 NEW cov: 11999 ft: 15277 corp: 26/903b lim: 100 exec/s: 33 rss: 72Mb L: 46/100 MS: 1 ShuffleBytes- 00:08:54.528 [2024-04-26 20:04:38.685851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.528 [2024-04-26 20:04:38.685882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.528 #34 NEW cov: 11999 ft: 15328 corp: 27/942b lim: 100 exec/s: 34 rss: 72Mb L: 39/100 MS: 1 CopyPart- 00:08:54.528 [2024-04-26 20:04:38.725929] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4811741872450060866 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.528 [2024-04-26 20:04:38.725955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.528 #35 NEW cov: 11999 ft: 15343 corp: 28/964b lim: 100 exec/s: 35 rss: 72Mb L: 22/100 MS: 1 ChangeByte- 00:08:54.528 [2024-04-26 20:04:38.766020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.528 [2024-04-26 20:04:38.766046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.528 #36 NEW cov: 11999 ft: 15407 corp: 29/1003b lim: 100 exec/s: 36 rss: 72Mb L: 39/100 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 
00:08:54.528 [2024-04-26 20:04:38.806154] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.528 [2024-04-26 20:04:38.806181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.528 #37 NEW cov: 11999 ft: 15434 corp: 30/1024b lim: 100 exec/s: 37 rss: 72Mb L: 21/100 MS: 1 ShuffleBytes- 00:08:54.528 [2024-04-26 20:04:38.846262] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4811741344169083458 len:48451 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.528 [2024-04-26 20:04:38.846288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.528 #38 NEW cov: 11999 ft: 15440 corp: 31/1046b lim: 100 exec/s: 38 rss: 72Mb L: 22/100 MS: 1 ShuffleBytes- 00:08:54.528 [2024-04-26 20:04:38.886361] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.528 [2024-04-26 20:04:38.886389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.528 #39 NEW cov: 11999 ft: 15452 corp: 32/1078b lim: 100 exec/s: 39 rss: 72Mb L: 32/100 MS: 1 EraseBytes- 00:08:54.528 [2024-04-26 20:04:38.926483] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.528 [2024-04-26 20:04:38.926511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.528 #40 NEW cov: 11999 ft: 15461 corp: 33/1099b lim: 100 exec/s: 40 rss: 72Mb L: 21/100 MS: 1 ShuffleBytes- 00:08:54.528 [2024-04-26 20:04:38.966592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.528 [2024-04-26 20:04:38.966620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.788 #41 NEW cov: 11999 ft: 15489 corp: 34/1120b lim: 100 exec/s: 41 rss: 72Mb L: 21/100 MS: 1 CopyPart- 00:08:54.788 [2024-04-26 20:04:39.007259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.007289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.788 [2024-04-26 20:04:39.007338] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:6582955727375784795 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.007354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.788 [2024-04-26 20:04:39.007406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.007420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 
sqhd:0004 p:0 m:0 dnr:1 00:08:54.788 [2024-04-26 20:04:39.007473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.007487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:54.788 [2024-04-26 20:04:39.007539] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:6582955728264977243 len:23388 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.007553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:54.788 #42 NEW cov: 11999 ft: 15502 corp: 35/1220b lim: 100 exec/s: 42 rss: 72Mb L: 100/100 MS: 1 ShuffleBytes- 00:08:54.788 [2024-04-26 20:04:39.056823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4811741872450060866 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.056851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.788 #43 NEW cov: 11999 ft: 15510 corp: 36/1247b lim: 100 exec/s: 43 rss: 72Mb L: 27/100 MS: 1 CrossOver- 00:08:54.788 [2024-04-26 20:04:39.097096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744070526223103 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.097123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.788 [2024-04-26 20:04:39.097160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18443366373989023743 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.097176] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:54.788 #44 NEW cov: 11999 ft: 15545 corp: 37/1293b lim: 100 exec/s: 44 rss: 73Mb L: 46/100 MS: 1 ChangeByte- 00:08:54.788 [2024-04-26 20:04:39.137054] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.137081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.788 #45 NEW cov: 11999 ft: 15550 corp: 38/1313b lim: 100 exec/s: 45 rss: 73Mb L: 20/100 MS: 1 ChangeBit- 00:08:54.788 [2024-04-26 20:04:39.177173] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4811741344169083458 len:48451 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.177199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:54.788 #46 NEW cov: 11999 ft: 15596 corp: 39/1335b lim: 100 exec/s: 46 rss: 73Mb L: 22/100 MS: 1 ChangeBit- 00:08:54.788 [2024-04-26 20:04:39.217324] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:54.788 [2024-04-26 20:04:39.217352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.047 #47 NEW cov: 11999 ft: 15640 corp: 40/1358b lim: 100 exec/s: 47 rss: 73Mb L: 23/100 MS: 1 CopyPart- 00:08:55.047 [2024-04-26 20:04:39.257443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16943 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.047 [2024-04-26 20:04:39.257470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.047 #48 NEW cov: 11999 ft: 15676 corp: 41/1378b lim: 100 exec/s: 48 rss: 73Mb L: 20/100 MS: 1 ShuffleBytes- 00:08:55.047 [2024-04-26 20:04:39.297544] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.047 [2024-04-26 20:04:39.297570] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.047 #50 NEW cov: 11999 ft: 15682 corp: 42/1399b lim: 100 exec/s: 50 rss: 73Mb L: 21/100 MS: 2 EraseBytes-CopyPart- 00:08:55.047 [2024-04-26 20:04:39.337795] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4774451407313060418 len:16963 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.047 [2024-04-26 20:04:39.337821] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:55.047 [2024-04-26 20:04:39.337858] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:8430738503547634242 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:55.047 [2024-04-26 20:04:39.337878] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:55.047 #51 NEW cov: 11999 ft: 15688 corp: 43/1446b lim: 100 exec/s: 25 rss: 73Mb L: 47/100 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:55.047 #51 DONE cov: 11999 ft: 15688 corp: 43/1446b lim: 100 exec/s: 25 rss: 73Mb 00:08:55.047 ###### Recommended dictionary. ###### 00:08:55.047 "u\000\000\000" # Uses: 1 00:08:55.048 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:55.048 ###### End of recommended dictionary. 
###### 00:08:55.048 Done 51 runs in 2 second(s) 00:08:55.307 20:04:39 -- nvmf/run.sh@54 -- # rm -rf /tmp/fuzz_json_24.conf /var/tmp/suppress_nvmf_fuzz 00:08:55.307 20:04:39 -- ../common.sh@72 -- # (( i++ )) 00:08:55.307 20:04:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.307 20:04:39 -- nvmf/run.sh@79 -- # trap - SIGINT SIGTERM EXIT 00:08:55.307 00:08:55.307 real 1m5.468s 00:08:55.307 user 1m40.535s 00:08:55.307 sys 0m8.183s 00:08:55.307 20:04:39 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:08:55.307 20:04:39 -- common/autotest_common.sh@10 -- # set +x 00:08:55.307 ************************************ 00:08:55.307 END TEST nvmf_fuzz 00:08:55.307 ************************************ 00:08:55.307 20:04:39 -- fuzz/llvm.sh@60 -- # for fuzzer in "${fuzzers[@]}" 00:08:55.307 20:04:39 -- fuzz/llvm.sh@61 -- # case "$fuzzer" in 00:08:55.307 20:04:39 -- fuzz/llvm.sh@63 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:55.307 20:04:39 -- common/autotest_common.sh@1097 -- # '[' 2 -le 1 ']' 00:08:55.307 20:04:39 -- common/autotest_common.sh@1103 -- # xtrace_disable 00:08:55.307 20:04:39 -- common/autotest_common.sh@10 -- # set +x 00:08:55.307 ************************************ 00:08:55.307 START TEST vfio_fuzz 00:08:55.307 ************************************ 00:08:55.307 20:04:39 -- common/autotest_common.sh@1121 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:55.568 * Looking for test storage... 00:08:55.568 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.568 20:04:39 -- vfio/run.sh@64 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:55.568 20:04:39 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:55.568 20:04:39 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:55.568 20:04:39 -- common/autotest_common.sh@34 -- # set -e 00:08:55.568 20:04:39 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:55.568 20:04:39 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:55.568 20:04:39 -- common/autotest_common.sh@38 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:55.568 20:04:39 -- common/autotest_common.sh@43 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:55.568 20:04:39 -- common/autotest_common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:55.568 20:04:39 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:55.568 20:04:39 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:55.568 20:04:39 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:55.568 20:04:39 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:55.568 20:04:39 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:55.568 20:04:39 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:55.568 20:04:39 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:55.568 20:04:39 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:55.568 20:04:39 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:55.569 20:04:39 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:55.569 20:04:39 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:55.569 20:04:39 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:55.569 20:04:39 -- common/build_config.sh@13 -- # 
CONFIG_VTUNE=n 00:08:55.569 20:04:39 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:55.569 20:04:39 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:55.569 20:04:39 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:55.569 20:04:39 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:55.569 20:04:39 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:55.569 20:04:39 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:55.569 20:04:39 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:55.569 20:04:39 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:55.569 20:04:39 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:55.569 20:04:39 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:55.569 20:04:39 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:55.569 20:04:39 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:55.569 20:04:39 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:55.569 20:04:39 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:55.569 20:04:39 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:55.569 20:04:39 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:55.569 20:04:39 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:55.569 20:04:39 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:55.569 20:04:39 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:55.569 20:04:39 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:55.569 20:04:39 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:55.569 20:04:39 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:55.569 20:04:39 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:55.569 20:04:39 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:55.569 20:04:39 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:55.569 20:04:39 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:55.569 20:04:39 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:55.569 20:04:39 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:55.569 20:04:39 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:08:55.569 20:04:39 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:55.569 20:04:39 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:55.569 20:04:39 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:55.569 20:04:39 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:55.569 20:04:39 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:55.569 20:04:39 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:55.569 20:04:39 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:55.569 20:04:39 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:55.569 20:04:39 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:55.569 20:04:39 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:55.569 20:04:39 -- common/build_config.sh@53 -- # CONFIG_HAVE_EVP_MAC=y 00:08:55.569 20:04:39 -- common/build_config.sh@54 -- # CONFIG_URING_ZNS=n 00:08:55.569 20:04:39 -- common/build_config.sh@55 -- # CONFIG_WERROR=y 00:08:55.569 20:04:39 -- common/build_config.sh@56 -- # CONFIG_HAVE_LIBBSD=n 00:08:55.569 20:04:39 -- common/build_config.sh@57 -- # CONFIG_UBSAN=y 00:08:55.569 20:04:39 -- 
common/build_config.sh@58 -- # CONFIG_IPSEC_MB_DIR= 00:08:55.569 20:04:39 -- common/build_config.sh@59 -- # CONFIG_GOLANG=n 00:08:55.569 20:04:39 -- common/build_config.sh@60 -- # CONFIG_ISAL=y 00:08:55.569 20:04:39 -- common/build_config.sh@61 -- # CONFIG_IDXD_KERNEL=n 00:08:55.569 20:04:39 -- common/build_config.sh@62 -- # CONFIG_DPDK_LIB_DIR= 00:08:55.569 20:04:39 -- common/build_config.sh@63 -- # CONFIG_RDMA_PROV=verbs 00:08:55.569 20:04:39 -- common/build_config.sh@64 -- # CONFIG_APPS=y 00:08:55.569 20:04:39 -- common/build_config.sh@65 -- # CONFIG_SHARED=n 00:08:55.569 20:04:39 -- common/build_config.sh@66 -- # CONFIG_HAVE_KEYUTILS=n 00:08:55.569 20:04:39 -- common/build_config.sh@67 -- # CONFIG_FC_PATH= 00:08:55.569 20:04:39 -- common/build_config.sh@68 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:55.569 20:04:39 -- common/build_config.sh@69 -- # CONFIG_FC=n 00:08:55.569 20:04:39 -- common/build_config.sh@70 -- # CONFIG_AVAHI=n 00:08:55.569 20:04:39 -- common/build_config.sh@71 -- # CONFIG_FIO_PLUGIN=y 00:08:55.569 20:04:39 -- common/build_config.sh@72 -- # CONFIG_RAID5F=n 00:08:55.569 20:04:39 -- common/build_config.sh@73 -- # CONFIG_EXAMPLES=y 00:08:55.569 20:04:39 -- common/build_config.sh@74 -- # CONFIG_TESTS=y 00:08:55.569 20:04:39 -- common/build_config.sh@75 -- # CONFIG_CRYPTO_MLX5=n 00:08:55.569 20:04:39 -- common/build_config.sh@76 -- # CONFIG_MAX_LCORES= 00:08:55.569 20:04:39 -- common/build_config.sh@77 -- # CONFIG_IPSEC_MB=n 00:08:55.569 20:04:39 -- common/build_config.sh@78 -- # CONFIG_PGO_DIR= 00:08:55.569 20:04:39 -- common/build_config.sh@79 -- # CONFIG_DEBUG=y 00:08:55.569 20:04:39 -- common/build_config.sh@80 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:55.569 20:04:39 -- common/build_config.sh@81 -- # CONFIG_CROSS_PREFIX= 00:08:55.569 20:04:39 -- common/build_config.sh@82 -- # CONFIG_URING=n 00:08:55.569 20:04:39 -- common/autotest_common.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:55.569 20:04:39 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:55.569 20:04:39 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:55.569 20:04:39 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:55.569 20:04:39 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:55.569 20:04:39 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.569 20:04:39 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:55.569 20:04:39 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.569 20:04:39 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:55.569 20:04:39 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:55.569 20:04:39 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:55.569 20:04:39 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:55.569 20:04:39 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:55.569 20:04:39 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:55.569 20:04:39 -- common/applications.sh@22 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:55.569 20:04:39 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:55.569 #define SPDK_CONFIG_H 00:08:55.569 #define SPDK_CONFIG_APPS 1 00:08:55.569 #define SPDK_CONFIG_ARCH native 00:08:55.569 #undef SPDK_CONFIG_ASAN 00:08:55.569 #undef SPDK_CONFIG_AVAHI 00:08:55.569 #undef SPDK_CONFIG_CET 00:08:55.569 #define SPDK_CONFIG_COVERAGE 1 00:08:55.569 #define SPDK_CONFIG_CROSS_PREFIX 00:08:55.569 #undef SPDK_CONFIG_CRYPTO 00:08:55.569 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:55.569 #undef SPDK_CONFIG_CUSTOMOCF 00:08:55.569 #undef SPDK_CONFIG_DAOS 00:08:55.569 #define SPDK_CONFIG_DAOS_DIR 00:08:55.569 #define SPDK_CONFIG_DEBUG 1 00:08:55.569 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:55.569 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:55.569 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:55.569 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:55.569 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:55.569 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:55.569 #define SPDK_CONFIG_EXAMPLES 1 00:08:55.569 #undef SPDK_CONFIG_FC 00:08:55.569 #define SPDK_CONFIG_FC_PATH 00:08:55.569 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:55.569 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:55.569 #undef SPDK_CONFIG_FUSE 00:08:55.569 #define SPDK_CONFIG_FUZZER 1 00:08:55.569 #define SPDK_CONFIG_FUZZER_LIB /usr/lib64/clang/16/lib/libclang_rt.fuzzer_no_main-x86_64.a 00:08:55.569 #undef SPDK_CONFIG_GOLANG 00:08:55.569 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:55.569 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:08:55.569 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:55.569 #undef SPDK_CONFIG_HAVE_KEYUTILS 00:08:55.569 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:55.569 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:55.569 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:55.569 #define SPDK_CONFIG_IDXD 1 00:08:55.569 #undef SPDK_CONFIG_IDXD_KERNEL 00:08:55.569 #undef SPDK_CONFIG_IPSEC_MB 00:08:55.569 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:55.569 #define SPDK_CONFIG_ISAL 1 00:08:55.569 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:55.569 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:55.569 #define SPDK_CONFIG_LIBDIR 00:08:55.569 #undef SPDK_CONFIG_LTO 00:08:55.569 #define SPDK_CONFIG_MAX_LCORES 00:08:55.569 #define SPDK_CONFIG_NVME_CUSE 1 00:08:55.569 #undef SPDK_CONFIG_OCF 00:08:55.569 #define SPDK_CONFIG_OCF_PATH 00:08:55.569 #define SPDK_CONFIG_OPENSSL_PATH 00:08:55.569 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:55.569 #define SPDK_CONFIG_PGO_DIR 00:08:55.569 #undef SPDK_CONFIG_PGO_USE 00:08:55.569 #define SPDK_CONFIG_PREFIX /usr/local 00:08:55.569 #undef SPDK_CONFIG_RAID5F 00:08:55.569 #undef SPDK_CONFIG_RBD 00:08:55.569 #define SPDK_CONFIG_RDMA 1 00:08:55.569 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:55.569 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:55.569 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:55.569 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:55.569 #undef SPDK_CONFIG_SHARED 00:08:55.569 #undef SPDK_CONFIG_SMA 00:08:55.569 #define SPDK_CONFIG_TESTS 1 00:08:55.569 #undef SPDK_CONFIG_TSAN 00:08:55.569 #define SPDK_CONFIG_UBLK 1 00:08:55.569 #define SPDK_CONFIG_UBSAN 1 00:08:55.569 #undef SPDK_CONFIG_UNIT_TESTS 00:08:55.569 #undef SPDK_CONFIG_URING 00:08:55.569 #define SPDK_CONFIG_URING_PATH 00:08:55.569 #undef SPDK_CONFIG_URING_ZNS 00:08:55.569 #undef SPDK_CONFIG_USDT 00:08:55.569 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:55.569 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 
00:08:55.569 #define SPDK_CONFIG_VFIO_USER 1 00:08:55.569 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:55.569 #define SPDK_CONFIG_VHOST 1 00:08:55.569 #define SPDK_CONFIG_VIRTIO 1 00:08:55.569 #undef SPDK_CONFIG_VTUNE 00:08:55.569 #define SPDK_CONFIG_VTUNE_DIR 00:08:55.569 #define SPDK_CONFIG_WERROR 1 00:08:55.569 #define SPDK_CONFIG_WPDK_DIR 00:08:55.569 #undef SPDK_CONFIG_XNVME 00:08:55.569 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:55.569 20:04:39 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:55.569 20:04:39 -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:55.569 20:04:39 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:55.569 20:04:39 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:55.569 20:04:39 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:55.569 20:04:39 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.569 20:04:39 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.569 20:04:39 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.569 20:04:39 -- paths/export.sh@5 -- # export PATH 00:08:55.569 20:04:39 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:55.569 20:04:39 -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:55.569 20:04:39 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:55.569 20:04:39 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:55.569 20:04:39 -- pm/common@6 -- # 
_pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:55.569 20:04:39 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:55.569 20:04:39 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:55.569 20:04:39 -- pm/common@67 -- # TEST_TAG=N/A 00:08:55.569 20:04:39 -- pm/common@68 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:55.569 20:04:39 -- pm/common@70 -- # PM_OUTPUTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:55.569 20:04:39 -- pm/common@71 -- # uname -s 00:08:55.569 20:04:39 -- pm/common@71 -- # PM_OS=Linux 00:08:55.569 20:04:39 -- pm/common@73 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:08:55.569 20:04:39 -- pm/common@74 -- # [[ Linux == FreeBSD ]] 00:08:55.569 20:04:39 -- pm/common@76 -- # [[ Linux == Linux ]] 00:08:55.569 20:04:39 -- pm/common@76 -- # [[ ............................... != QEMU ]] 00:08:55.569 20:04:39 -- pm/common@76 -- # [[ ! -e /.dockerenv ]] 00:08:55.569 20:04:39 -- pm/common@79 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:08:55.569 20:04:39 -- pm/common@80 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:08:55.569 20:04:39 -- pm/common@83 -- # MONITOR_RESOURCES_PIDS=() 00:08:55.569 20:04:39 -- pm/common@83 -- # declare -A MONITOR_RESOURCES_PIDS 00:08:55.569 20:04:39 -- pm/common@85 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:08:55.569 20:04:39 -- common/autotest_common.sh@57 -- # : 0 00:08:55.569 20:04:39 -- common/autotest_common.sh@58 -- # export RUN_NIGHTLY 00:08:55.569 20:04:39 -- common/autotest_common.sh@61 -- # : 0 00:08:55.569 20:04:39 -- common/autotest_common.sh@62 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:55.569 20:04:39 -- common/autotest_common.sh@63 -- # : 0 00:08:55.569 20:04:39 -- common/autotest_common.sh@64 -- # export SPDK_RUN_VALGRIND 00:08:55.570 20:04:39 -- common/autotest_common.sh@65 -- # : 1 00:08:55.570 20:04:39 -- common/autotest_common.sh@66 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:55.570 20:04:39 -- common/autotest_common.sh@67 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@68 -- # export SPDK_TEST_UNITTEST 00:08:55.570 20:04:39 -- common/autotest_common.sh@69 -- # : 00:08:55.570 20:04:39 -- common/autotest_common.sh@70 -- # export SPDK_TEST_AUTOBUILD 00:08:55.570 20:04:39 -- common/autotest_common.sh@71 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@72 -- # export SPDK_TEST_RELEASE_BUILD 00:08:55.570 20:04:39 -- common/autotest_common.sh@73 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@74 -- # export SPDK_TEST_ISAL 00:08:55.570 20:04:39 -- common/autotest_common.sh@75 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@76 -- # export SPDK_TEST_ISCSI 00:08:55.570 20:04:39 -- common/autotest_common.sh@77 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@78 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:55.570 20:04:39 -- common/autotest_common.sh@79 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@80 -- # export SPDK_TEST_NVME 00:08:55.570 20:04:39 -- common/autotest_common.sh@81 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@82 -- # export SPDK_TEST_NVME_PMR 00:08:55.570 20:04:39 -- common/autotest_common.sh@83 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@84 -- # export SPDK_TEST_NVME_BP 00:08:55.570 20:04:39 -- common/autotest_common.sh@85 -- # : 0 00:08:55.570 20:04:39 -- 
common/autotest_common.sh@86 -- # export SPDK_TEST_NVME_CLI 00:08:55.570 20:04:39 -- common/autotest_common.sh@87 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@88 -- # export SPDK_TEST_NVME_CUSE 00:08:55.570 20:04:39 -- common/autotest_common.sh@89 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@90 -- # export SPDK_TEST_NVME_FDP 00:08:55.570 20:04:39 -- common/autotest_common.sh@91 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@92 -- # export SPDK_TEST_NVMF 00:08:55.570 20:04:39 -- common/autotest_common.sh@93 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@94 -- # export SPDK_TEST_VFIOUSER 00:08:55.570 20:04:39 -- common/autotest_common.sh@95 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@96 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:55.570 20:04:39 -- common/autotest_common.sh@97 -- # : 1 00:08:55.570 20:04:39 -- common/autotest_common.sh@98 -- # export SPDK_TEST_FUZZER 00:08:55.570 20:04:39 -- common/autotest_common.sh@99 -- # : 1 00:08:55.570 20:04:39 -- common/autotest_common.sh@100 -- # export SPDK_TEST_FUZZER_SHORT 00:08:55.570 20:04:39 -- common/autotest_common.sh@101 -- # : rdma 00:08:55.570 20:04:39 -- common/autotest_common.sh@102 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:55.570 20:04:39 -- common/autotest_common.sh@103 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@104 -- # export SPDK_TEST_RBD 00:08:55.570 20:04:39 -- common/autotest_common.sh@105 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@106 -- # export SPDK_TEST_VHOST 00:08:55.570 20:04:39 -- common/autotest_common.sh@107 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@108 -- # export SPDK_TEST_BLOCKDEV 00:08:55.570 20:04:39 -- common/autotest_common.sh@109 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@110 -- # export SPDK_TEST_IOAT 00:08:55.570 20:04:39 -- common/autotest_common.sh@111 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@112 -- # export SPDK_TEST_BLOBFS 00:08:55.570 20:04:39 -- common/autotest_common.sh@113 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@114 -- # export SPDK_TEST_VHOST_INIT 00:08:55.570 20:04:39 -- common/autotest_common.sh@115 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@116 -- # export SPDK_TEST_LVOL 00:08:55.570 20:04:39 -- common/autotest_common.sh@117 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@118 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:55.570 20:04:39 -- common/autotest_common.sh@119 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@120 -- # export SPDK_RUN_ASAN 00:08:55.570 20:04:39 -- common/autotest_common.sh@121 -- # : 1 00:08:55.570 20:04:39 -- common/autotest_common.sh@122 -- # export SPDK_RUN_UBSAN 00:08:55.570 20:04:39 -- common/autotest_common.sh@123 -- # : 00:08:55.570 20:04:39 -- common/autotest_common.sh@124 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:55.570 20:04:39 -- common/autotest_common.sh@125 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@126 -- # export SPDK_RUN_NON_ROOT 00:08:55.570 20:04:39 -- common/autotest_common.sh@127 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@128 -- # export SPDK_TEST_CRYPTO 00:08:55.570 20:04:39 -- common/autotest_common.sh@129 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@130 -- # export SPDK_TEST_FTL 00:08:55.570 20:04:39 -- common/autotest_common.sh@131 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@132 -- # export SPDK_TEST_OCF 00:08:55.570 20:04:39 -- common/autotest_common.sh@133 -- # : 0 
00:08:55.570 20:04:39 -- common/autotest_common.sh@134 -- # export SPDK_TEST_VMD 00:08:55.570 20:04:39 -- common/autotest_common.sh@135 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@136 -- # export SPDK_TEST_OPAL 00:08:55.570 20:04:39 -- common/autotest_common.sh@137 -- # : 00:08:55.570 20:04:39 -- common/autotest_common.sh@138 -- # export SPDK_TEST_NATIVE_DPDK 00:08:55.570 20:04:39 -- common/autotest_common.sh@139 -- # : true 00:08:55.570 20:04:39 -- common/autotest_common.sh@140 -- # export SPDK_AUTOTEST_X 00:08:55.570 20:04:39 -- common/autotest_common.sh@141 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@142 -- # export SPDK_TEST_RAID5 00:08:55.570 20:04:39 -- common/autotest_common.sh@143 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@144 -- # export SPDK_TEST_URING 00:08:55.570 20:04:39 -- common/autotest_common.sh@145 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@146 -- # export SPDK_TEST_USDT 00:08:55.570 20:04:39 -- common/autotest_common.sh@147 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@148 -- # export SPDK_TEST_USE_IGB_UIO 00:08:55.570 20:04:39 -- common/autotest_common.sh@149 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@150 -- # export SPDK_TEST_SCHEDULER 00:08:55.570 20:04:39 -- common/autotest_common.sh@151 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@152 -- # export SPDK_TEST_SCANBUILD 00:08:55.570 20:04:39 -- common/autotest_common.sh@153 -- # : 00:08:55.570 20:04:39 -- common/autotest_common.sh@154 -- # export SPDK_TEST_NVMF_NICS 00:08:55.570 20:04:39 -- common/autotest_common.sh@155 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@156 -- # export SPDK_TEST_SMA 00:08:55.570 20:04:39 -- common/autotest_common.sh@157 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@158 -- # export SPDK_TEST_DAOS 00:08:55.570 20:04:39 -- common/autotest_common.sh@159 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@160 -- # export SPDK_TEST_XNVME 00:08:55.570 20:04:39 -- common/autotest_common.sh@161 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@162 -- # export SPDK_TEST_ACCEL_DSA 00:08:55.570 20:04:39 -- common/autotest_common.sh@163 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@164 -- # export SPDK_TEST_ACCEL_IAA 00:08:55.570 20:04:39 -- common/autotest_common.sh@166 -- # : 00:08:55.570 20:04:39 -- common/autotest_common.sh@167 -- # export SPDK_TEST_FUZZER_TARGET 00:08:55.570 20:04:39 -- common/autotest_common.sh@168 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@169 -- # export SPDK_TEST_NVMF_MDNS 00:08:55.570 20:04:39 -- common/autotest_common.sh@170 -- # : 0 00:08:55.570 20:04:39 -- common/autotest_common.sh@171 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:55.570 20:04:39 -- common/autotest_common.sh@174 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:55.570 20:04:39 -- common/autotest_common.sh@174 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:55.570 20:04:39 -- common/autotest_common.sh@175 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:55.570 20:04:39 -- common/autotest_common.sh@175 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:55.570 20:04:39 -- common/autotest_common.sh@176 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.570 20:04:39 -- common/autotest_common.sh@176 
-- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.570 20:04:39 -- common/autotest_common.sh@177 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.570 20:04:39 -- common/autotest_common.sh@177 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:55.570 20:04:39 -- common/autotest_common.sh@180 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:55.570 20:04:39 -- common/autotest_common.sh@180 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:55.570 20:04:39 -- common/autotest_common.sh@184 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:55.570 20:04:39 -- common/autotest_common.sh@184 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:55.570 20:04:39 -- common/autotest_common.sh@188 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:55.570 20:04:39 -- common/autotest_common.sh@188 -- # PYTHONDONTWRITEBYTECODE=1 00:08:55.570 20:04:39 -- common/autotest_common.sh@192 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:55.570 
20:04:39 -- common/autotest_common.sh@192 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:55.570 20:04:39 -- common/autotest_common.sh@193 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:55.570 20:04:39 -- common/autotest_common.sh@193 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:55.570 20:04:39 -- common/autotest_common.sh@197 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:55.570 20:04:39 -- common/autotest_common.sh@198 -- # rm -rf /var/tmp/asan_suppression_file 00:08:55.570 20:04:39 -- common/autotest_common.sh@199 -- # cat 00:08:55.570 20:04:39 -- common/autotest_common.sh@235 -- # echo leak:libfuse3.so 00:08:55.570 20:04:39 -- common/autotest_common.sh@237 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:55.570 20:04:39 -- common/autotest_common.sh@237 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:55.570 20:04:39 -- common/autotest_common.sh@239 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:55.570 20:04:39 -- common/autotest_common.sh@239 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:55.570 20:04:39 -- common/autotest_common.sh@241 -- # '[' -z /var/spdk/dependencies ']' 00:08:55.570 20:04:39 -- common/autotest_common.sh@244 -- # export DEPENDENCY_DIR 00:08:55.570 20:04:39 -- common/autotest_common.sh@248 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.570 20:04:39 -- common/autotest_common.sh@248 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:55.570 20:04:39 -- common/autotest_common.sh@249 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.570 20:04:39 -- common/autotest_common.sh@249 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:55.570 20:04:39 -- common/autotest_common.sh@252 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:55.570 20:04:39 -- common/autotest_common.sh@252 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:55.570 20:04:39 -- common/autotest_common.sh@253 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:55.570 20:04:39 -- common/autotest_common.sh@253 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:55.570 20:04:39 -- common/autotest_common.sh@255 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:55.570 20:04:39 -- common/autotest_common.sh@255 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:55.570 20:04:39 -- common/autotest_common.sh@258 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:55.570 20:04:39 -- common/autotest_common.sh@258 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:55.570 20:04:39 -- common/autotest_common.sh@261 -- # '[' 0 -eq 0 ']' 00:08:55.570 20:04:39 -- common/autotest_common.sh@262 -- # export valgrind= 00:08:55.570 20:04:39 -- common/autotest_common.sh@262 -- # valgrind= 00:08:55.570 20:04:39 -- common/autotest_common.sh@268 -- # uname -s 00:08:55.570 20:04:39 -- common/autotest_common.sh@268 -- # '[' Linux = Linux ']' 00:08:55.570 20:04:39 -- common/autotest_common.sh@269 -- # HUGEMEM=4096 00:08:55.570 20:04:39 -- common/autotest_common.sh@270 -- # export CLEAR_HUGE=yes 00:08:55.570 
20:04:39 -- common/autotest_common.sh@270 -- # CLEAR_HUGE=yes 00:08:55.570 20:04:39 -- common/autotest_common.sh@271 -- # [[ 0 -eq 1 ]] 00:08:55.570 20:04:39 -- common/autotest_common.sh@271 -- # [[ 0 -eq 1 ]] 00:08:55.570 20:04:39 -- common/autotest_common.sh@278 -- # MAKE=make 00:08:55.570 20:04:39 -- common/autotest_common.sh@279 -- # MAKEFLAGS=-j72 00:08:55.570 20:04:39 -- common/autotest_common.sh@295 -- # export HUGEMEM=4096 00:08:55.570 20:04:39 -- common/autotest_common.sh@295 -- # HUGEMEM=4096 00:08:55.570 20:04:39 -- common/autotest_common.sh@297 -- # NO_HUGE=() 00:08:55.570 20:04:39 -- common/autotest_common.sh@298 -- # TEST_MODE= 00:08:55.570 20:04:39 -- common/autotest_common.sh@317 -- # [[ -z 1629871 ]] 00:08:55.570 20:04:39 -- common/autotest_common.sh@317 -- # kill -0 1629871 00:08:55.570 20:04:39 -- common/autotest_common.sh@1676 -- # set_test_storage 2147483648 00:08:55.570 20:04:39 -- common/autotest_common.sh@327 -- # [[ -v testdir ]] 00:08:55.570 20:04:39 -- common/autotest_common.sh@329 -- # local requested_size=2147483648 00:08:55.570 20:04:39 -- common/autotest_common.sh@330 -- # local mount target_dir 00:08:55.570 20:04:39 -- common/autotest_common.sh@332 -- # local -A mounts fss sizes avails uses 00:08:55.570 20:04:39 -- common/autotest_common.sh@333 -- # local source fs size avail mount use 00:08:55.570 20:04:39 -- common/autotest_common.sh@335 -- # local storage_fallback storage_candidates 00:08:55.570 20:04:39 -- common/autotest_common.sh@337 -- # mktemp -udt spdk.XXXXXX 00:08:55.570 20:04:39 -- common/autotest_common.sh@337 -- # storage_fallback=/tmp/spdk.T6Ad6e 00:08:55.570 20:04:39 -- common/autotest_common.sh@342 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:55.570 20:04:39 -- common/autotest_common.sh@344 -- # [[ -n '' ]] 00:08:55.570 20:04:39 -- common/autotest_common.sh@349 -- # [[ -n '' ]] 00:08:55.571 20:04:39 -- common/autotest_common.sh@354 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.T6Ad6e/tests/vfio /tmp/spdk.T6Ad6e 00:08:55.571 20:04:39 -- common/autotest_common.sh@357 -- # requested_size=2214592512 00:08:55.571 20:04:39 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:08:55.571 20:04:39 -- common/autotest_common.sh@326 -- # df -T 00:08:55.571 20:04:39 -- common/autotest_common.sh@326 -- # grep -v Filesystem 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # mounts["$mount"]=spdk_devtmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # fss["$mount"]=devtmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # avails["$mount"]=67108864 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # sizes["$mount"]=67108864 00:08:55.571 20:04:39 -- common/autotest_common.sh@362 -- # uses["$mount"]=0 00:08:55.571 20:04:39 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # mounts["$mount"]=/dev/pmem0 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # fss["$mount"]=ext2 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # avails["$mount"]=818380800 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # sizes["$mount"]=5284429824 00:08:55.571 20:04:39 -- common/autotest_common.sh@362 -- # uses["$mount"]=4466049024 00:08:55.571 20:04:39 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # 
mounts["$mount"]=spdk_root 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # fss["$mount"]=overlay 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # avails["$mount"]=87031554048 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # sizes["$mount"]=94508552192 00:08:55.571 20:04:39 -- common/autotest_common.sh@362 -- # uses["$mount"]=7476998144 00:08:55.571 20:04:39 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # avails["$mount"]=47251660800 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254274048 00:08:55.571 20:04:39 -- common/autotest_common.sh@362 -- # uses["$mount"]=2613248 00:08:55.571 20:04:39 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # avails["$mount"]=18895634432 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # sizes["$mount"]=18901712896 00:08:55.571 20:04:39 -- common/autotest_common.sh@362 -- # uses["$mount"]=6078464 00:08:55.571 20:04:39 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # avails["$mount"]=47253700608 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # sizes["$mount"]=47254278144 00:08:55.571 20:04:39 -- common/autotest_common.sh@362 -- # uses["$mount"]=577536 00:08:55.571 20:04:39 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # mounts["$mount"]=tmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@360 -- # fss["$mount"]=tmpfs 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # avails["$mount"]=9450848256 00:08:55.571 20:04:39 -- common/autotest_common.sh@361 -- # sizes["$mount"]=9450852352 00:08:55.571 20:04:39 -- common/autotest_common.sh@362 -- # uses["$mount"]=4096 00:08:55.571 20:04:39 -- common/autotest_common.sh@359 -- # read -r source fs size use avail _ mount 00:08:55.571 20:04:39 -- common/autotest_common.sh@365 -- # printf '* Looking for test storage...\n' 00:08:55.571 * Looking for test storage... 
00:08:55.571 20:04:39 -- common/autotest_common.sh@367 -- # local target_space new_size 00:08:55.571 20:04:39 -- common/autotest_common.sh@368 -- # for target_dir in "${storage_candidates[@]}" 00:08:55.571 20:04:39 -- common/autotest_common.sh@371 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.571 20:04:39 -- common/autotest_common.sh@371 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:55.571 20:04:39 -- common/autotest_common.sh@371 -- # mount=/ 00:08:55.571 20:04:39 -- common/autotest_common.sh@373 -- # target_space=87031554048 00:08:55.571 20:04:39 -- common/autotest_common.sh@374 -- # (( target_space == 0 || target_space < requested_size )) 00:08:55.571 20:04:39 -- common/autotest_common.sh@377 -- # (( target_space >= requested_size )) 00:08:55.571 20:04:39 -- common/autotest_common.sh@379 -- # [[ overlay == tmpfs ]] 00:08:55.571 20:04:39 -- common/autotest_common.sh@379 -- # [[ overlay == ramfs ]] 00:08:55.571 20:04:39 -- common/autotest_common.sh@379 -- # [[ / == / ]] 00:08:55.571 20:04:39 -- common/autotest_common.sh@380 -- # new_size=9691590656 00:08:55.571 20:04:39 -- common/autotest_common.sh@381 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:55.571 20:04:39 -- common/autotest_common.sh@386 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.571 20:04:39 -- common/autotest_common.sh@386 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.571 20:04:39 -- common/autotest_common.sh@387 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.571 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:55.571 20:04:39 -- common/autotest_common.sh@388 -- # return 0 00:08:55.571 20:04:39 -- common/autotest_common.sh@1678 -- # set -o errtrace 00:08:55.571 20:04:39 -- common/autotest_common.sh@1679 -- # shopt -s extdebug 00:08:55.571 20:04:39 -- common/autotest_common.sh@1680 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:55.571 20:04:39 -- common/autotest_common.sh@1682 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:55.571 20:04:39 -- common/autotest_common.sh@1683 -- # true 00:08:55.571 20:04:39 -- common/autotest_common.sh@1685 -- # xtrace_fd 00:08:55.571 20:04:39 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:55.571 20:04:39 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:55.571 20:04:39 -- common/autotest_common.sh@27 -- # exec 00:08:55.571 20:04:39 -- common/autotest_common.sh@29 -- # exec 00:08:55.571 20:04:39 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:55.571 20:04:39 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:55.571 20:04:39 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:55.571 20:04:39 -- common/autotest_common.sh@18 -- # set -x 00:08:55.571 20:04:39 -- vfio/run.sh@65 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:55.571 20:04:39 -- ../common.sh@8 -- # pids=() 00:08:55.571 20:04:39 -- vfio/run.sh@67 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:55.571 20:04:39 -- vfio/run.sh@68 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:55.571 20:04:39 -- vfio/run.sh@68 -- # fuzz_num=7 00:08:55.571 20:04:39 -- vfio/run.sh@69 -- # (( fuzz_num != 0 )) 00:08:55.571 20:04:39 -- vfio/run.sh@71 -- # trap 'cleanup /tmp/vfio-user-* /var/tmp/suppress_vfio_fuzz; exit 1' SIGINT SIGTERM EXIT 00:08:55.571 20:04:39 -- vfio/run.sh@74 -- # mem_size=0 00:08:55.571 20:04:39 -- vfio/run.sh@75 -- # [[ 1 -eq 1 ]] 00:08:55.571 20:04:39 -- vfio/run.sh@76 -- # start_llvm_fuzz_short 7 1 00:08:55.571 20:04:39 -- ../common.sh@69 -- # local fuzz_num=7 00:08:55.571 20:04:39 -- ../common.sh@70 -- # local time=1 00:08:55.571 20:04:39 -- ../common.sh@72 -- # (( i = 0 )) 00:08:55.571 20:04:39 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:55.571 20:04:39 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:55.571 20:04:39 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:55.571 20:04:39 -- vfio/run.sh@23 -- # local timen=1 00:08:55.571 20:04:39 -- vfio/run.sh@24 -- # local core=0x1 00:08:55.571 20:04:39 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:55.571 20:04:39 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:55.571 20:04:39 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:55.571 20:04:39 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:55.571 20:04:39 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:55.571 20:04:39 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:55.571 20:04:39 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:55.571 20:04:39 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:55.571 20:04:39 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:55.571 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:55.571 20:04:40 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:55.571 20:04:40 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:55.571 20:04:40 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 -Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:55.830 [2024-04-26 20:04:40.032765] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 
00:08:55.830 [2024-04-26 20:04:40.032858] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1629918 ] 00:08:55.830 EAL: No free 2048 kB hugepages reported on node 1 00:08:55.830 [2024-04-26 20:04:40.124417] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.830 [2024-04-26 20:04:40.207490] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:56.089 INFO: Running with entropic power schedule (0xFF, 100). 00:08:56.089 INFO: Seed: 2417162351 00:08:56.089 INFO: Loaded 1 modules (345776 inline 8-bit counters): 345776 [0x28775cc, 0x28cbc7c), 00:08:56.089 INFO: Loaded 1 PC tables (345776 PCs): 345776 [0x28cbc80,0x2e12780), 00:08:56.089 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:56.089 INFO: A corpus is not provided, starting from an empty corpus 00:08:56.089 #2 INITED exec/s: 0 rss: 63Mb 00:08:56.089 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:56.089 This may also happen if the target rejected all inputs we tried so far 00:08:56.089 [2024-04-26 20:04:40.471070] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: enabling controller 00:08:56.606 NEW_FUNC[1/635]: 0x481720 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:84 00:08:56.606 NEW_FUNC[2/635]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:56.606 #7 NEW cov: 10833 ft: 10348 corp: 2/7b lim: 6 exec/s: 0 rss: 69Mb L: 6/6 MS: 5 InsertRepeatedBytes-EraseBytes-ChangeBinInt-ShuffleBytes-CopyPart- 00:08:56.606 #10 NEW cov: 10860 ft: 13397 corp: 3/13b lim: 6 exec/s: 0 rss: 70Mb L: 6/6 MS: 3 ChangeBit-InsertByte-InsertRepeatedBytes- 00:08:56.864 #12 NEW cov: 10860 ft: 15173 corp: 4/19b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 2 EraseBytes-CopyPart- 00:08:56.864 NEW_FUNC[1/1]: 0x19904a0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:56.864 #13 NEW cov: 10877 ft: 15296 corp: 5/25b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 CopyPart- 00:08:57.122 #14 NEW cov: 10877 ft: 15709 corp: 6/31b lim: 6 exec/s: 0 rss: 72Mb L: 6/6 MS: 1 CrossOver- 00:08:57.122 #15 NEW cov: 10877 ft: 15898 corp: 7/37b lim: 6 exec/s: 15 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:57.380 #16 NEW cov: 10877 ft: 16298 corp: 8/43b lim: 6 exec/s: 16 rss: 72Mb L: 6/6 MS: 1 ChangeBinInt- 00:08:57.380 #22 NEW cov: 10877 ft: 16612 corp: 9/49b lim: 6 exec/s: 22 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:57.638 #28 NEW cov: 10877 ft: 16720 corp: 10/55b lim: 6 exec/s: 28 rss: 72Mb L: 6/6 MS: 1 ChangeBit- 00:08:57.638 #29 NEW cov: 10877 ft: 16848 corp: 11/61b lim: 6 exec/s: 29 rss: 72Mb L: 6/6 MS: 1 CMP- DE: "\200\000\000\000"- 00:08:57.897 #30 NEW cov: 10877 ft: 16871 corp: 12/67b lim: 6 exec/s: 30 rss: 72Mb L: 6/6 MS: 1 PersAutoDict- DE: "\200\000\000\000"- 00:08:57.897 #31 NEW cov: 10884 ft: 16982 corp: 13/73b lim: 6 exec/s: 31 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:58.154 #37 NEW cov: 10884 ft: 16992 corp: 14/79b lim: 6 exec/s: 37 rss: 72Mb L: 6/6 MS: 1 ChangeByte- 00:08:58.154 #38 NEW cov: 10884 ft: 17107 corp: 15/85b lim: 6 exec/s: 19 rss: 72Mb L: 6/6 MS: 1 ShuffleBytes- 00:08:58.154 #38 DONE cov: 10884 ft: 17107 corp: 15/85b lim: 6 exec/s: 19 
rss: 72Mb 00:08:58.154 ###### Recommended dictionary. ###### 00:08:58.154 "\200\000\000\000" # Uses: 1 00:08:58.154 ###### End of recommended dictionary. ###### 00:08:58.154 Done 38 runs in 2 second(s) 00:08:58.154 [2024-04-26 20:04:42.498078] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-0/domain/2: disabling controller 00:08:58.413 20:04:42 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-0 /var/tmp/suppress_vfio_fuzz 00:08:58.413 20:04:42 -- ../common.sh@72 -- # (( i++ )) 00:08:58.413 20:04:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:58.413 20:04:42 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:58.413 20:04:42 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:58.413 20:04:42 -- vfio/run.sh@23 -- # local timen=1 00:08:58.413 20:04:42 -- vfio/run.sh@24 -- # local core=0x1 00:08:58.413 20:04:42 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:58.413 20:04:42 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:58.413 20:04:42 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:58.413 20:04:42 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:58.413 20:04:42 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:58.413 20:04:42 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:08:58.413 20:04:42 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:08:58.413 20:04:42 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:58.413 20:04:42 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:58.413 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:58.413 20:04:42 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:08:58.413 20:04:42 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:08:58.413 20:04:42 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:58.413 [2024-04-26 20:04:42.813549] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:08:58.413 [2024-04-26 20:04:42.813627] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630274 ] 00:08:58.413 EAL: No free 2048 kB hugepages reported on node 1 00:08:58.671 [2024-04-26 20:04:42.900444] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:58.671 [2024-04-26 20:04:42.983750] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.930 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:58.930 INFO: Seed: 894181741 00:08:58.930 INFO: Loaded 1 modules (345776 inline 8-bit counters): 345776 [0x28775cc, 0x28cbc7c), 00:08:58.930 INFO: Loaded 1 PC tables (345776 PCs): 345776 [0x28cbc80,0x2e12780), 00:08:58.930 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:58.930 INFO: A corpus is not provided, starting from an empty corpus 00:08:58.930 #2 INITED exec/s: 0 rss: 64Mb 00:08:58.930 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:58.930 This may also happen if the target rejected all inputs we tried so far 00:08:58.930 [2024-04-26 20:04:43.238818] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: enabling controller 00:08:58.930 [2024-04-26 20:04:43.291909] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:58.930 [2024-04-26 20:04:43.291935] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:58.930 [2024-04-26 20:04:43.291969] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.447 NEW_FUNC[1/635]: 0x481cc0 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:71 00:08:59.447 NEW_FUNC[2/635]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:59.447 #16 NEW cov: 10784 ft: 10771 corp: 2/5b lim: 4 exec/s: 0 rss: 70Mb L: 4/4 MS: 4 CopyPart-CrossOver-CrossOver-InsertByte- 00:08:59.447 [2024-04-26 20:04:43.792809] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.447 [2024-04-26 20:04:43.792848] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.447 [2024-04-26 20:04:43.792866] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.705 NEW_FUNC[1/2]: 0x1674fb0 in nvme_pcie_qpair_process_completions /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_pcie_common.c:826 00:08:59.705 NEW_FUNC[2/2]: 0x1684d80 in nvme_payload_type /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:260 00:08:59.705 #17 NEW cov: 10846 ft: 13987 corp: 3/9b lim: 4 exec/s: 0 rss: 71Mb L: 4/4 MS: 1 CopyPart- 00:08:59.705 [2024-04-26 20:04:43.995638] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.705 [2024-04-26 20:04:43.995662] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.705 [2024-04-26 20:04:43.995695] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.705 NEW_FUNC[1/1]: 0x19904a0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:59.705 #20 NEW cov: 10863 ft: 14841 corp: 4/13b lim: 4 exec/s: 0 rss: 72Mb L: 4/4 MS: 3 InsertByte-CopyPart-CopyPart- 00:08:59.964 [2024-04-26 20:04:44.197384] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.964 [2024-04-26 20:04:44.197407] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.964 [2024-04-26 20:04:44.197439] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:59.964 #26 NEW cov: 10863 ft: 15490 corp: 5/17b lim: 4 exec/s: 26 rss: 72Mb L: 4/4 MS: 1 ChangeBinInt- 00:08:59.964 [2024-04-26 20:04:44.387919] 
vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:59.964 [2024-04-26 20:04:44.387943] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:59.964 [2024-04-26 20:04:44.387975] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.222 #27 NEW cov: 10863 ft: 16730 corp: 6/21b lim: 4 exec/s: 27 rss: 72Mb L: 4/4 MS: 1 ChangeBit- 00:09:00.222 [2024-04-26 20:04:44.579277] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.222 [2024-04-26 20:04:44.579300] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.222 [2024-04-26 20:04:44.579332] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.480 #31 NEW cov: 10863 ft: 17127 corp: 7/25b lim: 4 exec/s: 31 rss: 72Mb L: 4/4 MS: 4 ShuffleBytes-InsertByte-ChangeByte-CopyPart- 00:09:00.480 [2024-04-26 20:04:44.781517] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.480 [2024-04-26 20:04:44.781539] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.480 [2024-04-26 20:04:44.781557] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.480 #32 NEW cov: 10863 ft: 17242 corp: 8/29b lim: 4 exec/s: 32 rss: 72Mb L: 4/4 MS: 1 ChangeBit- 00:09:00.738 [2024-04-26 20:04:44.972202] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.738 [2024-04-26 20:04:44.972224] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.738 [2024-04-26 20:04:44.972241] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.738 #33 NEW cov: 10870 ft: 17505 corp: 9/33b lim: 4 exec/s: 33 rss: 72Mb L: 4/4 MS: 1 CopyPart- 00:09:00.738 [2024-04-26 20:04:45.163613] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:09:00.738 [2024-04-26 20:04:45.163635] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:09:00.738 [2024-04-26 20:04:45.163652] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:09:00.997 #34 NEW cov: 10870 ft: 17620 corp: 10/37b lim: 4 exec/s: 17 rss: 72Mb L: 4/4 MS: 1 CrossOver- 00:09:00.997 #34 DONE cov: 10870 ft: 17620 corp: 10/37b lim: 4 exec/s: 17 rss: 72Mb 00:09:00.997 Done 34 runs in 2 second(s) 00:09:00.997 [2024-04-26 20:04:45.301088] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-1/domain/2: disabling controller 00:09:01.255 20:04:45 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-1 /var/tmp/suppress_vfio_fuzz 00:09:01.255 20:04:45 -- ../common.sh@72 -- # (( i++ )) 00:09:01.255 20:04:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:01.255 20:04:45 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:09:01.255 20:04:45 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:09:01.255 20:04:45 -- vfio/run.sh@23 -- # local timen=1 00:09:01.255 20:04:45 -- vfio/run.sh@24 -- # local core=0x1 00:09:01.255 20:04:45 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:01.255 20:04:45 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:09:01.255 20:04:45 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:09:01.255 20:04:45 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 
00:09:01.255 20:04:45 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:09:01.255 20:04:45 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:01.255 20:04:45 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:01.255 20:04:45 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:01.255 20:04:45 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:09:01.255 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:01.255 20:04:45 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:01.255 20:04:45 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:01.255 20:04:45 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:09:01.255 [2024-04-26 20:04:45.626517] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:09:01.255 [2024-04-26 20:04:45.626597] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630635 ] 00:09:01.255 EAL: No free 2048 kB hugepages reported on node 1 00:09:01.514 [2024-04-26 20:04:45.716510] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.514 [2024-04-26 20:04:45.804494] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:01.772 INFO: Running with entropic power schedule (0xFF, 100). 00:09:01.772 INFO: Seed: 3712182012 00:09:01.772 INFO: Loaded 1 modules (345776 inline 8-bit counters): 345776 [0x28775cc, 0x28cbc7c), 00:09:01.772 INFO: Loaded 1 PC tables (345776 PCs): 345776 [0x28cbc80,0x2e12780), 00:09:01.772 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:09:01.772 INFO: A corpus is not provided, starting from an empty corpus 00:09:01.772 #2 INITED exec/s: 0 rss: 63Mb 00:09:01.772 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:01.772 This may also happen if the target rejected all inputs we tried so far 00:09:01.772 [2024-04-26 20:04:46.055790] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: enabling controller 00:09:01.772 [2024-04-26 20:04:46.123517] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:02.289 NEW_FUNC[1/636]: 0x4826a0 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:103 00:09:02.289 NEW_FUNC[2/636]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:02.289 #10 NEW cov: 10815 ft: 10751 corp: 2/9b lim: 8 exec/s: 0 rss: 70Mb L: 8/8 MS: 3 InsertByte-InsertRepeatedBytes-CopyPart- 00:09:02.289 [2024-04-26 20:04:46.623389] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:02.289 #16 NEW cov: 10829 ft: 14178 corp: 3/17b lim: 8 exec/s: 0 rss: 71Mb L: 8/8 MS: 1 CopyPart- 00:09:02.547 [2024-04-26 20:04:46.816893] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:02.547 NEW_FUNC[1/1]: 0x19904a0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:02.547 #27 NEW cov: 10846 ft: 14696 corp: 4/25b lim: 8 exec/s: 0 rss: 72Mb L: 8/8 MS: 1 CopyPart- 00:09:02.805 [2024-04-26 20:04:47.005022] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:02.805 #28 NEW cov: 10846 ft: 15812 corp: 5/33b lim: 8 exec/s: 28 rss: 72Mb L: 8/8 MS: 1 ChangeByte- 00:09:02.805 [2024-04-26 20:04:47.187997] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:03.062 #29 NEW cov: 10846 ft: 16160 corp: 6/41b lim: 8 exec/s: 29 rss: 72Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:03.062 [2024-04-26 20:04:47.368166] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:03.062 #30 NEW cov: 10846 ft: 16350 corp: 7/49b lim: 8 exec/s: 30 rss: 72Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:03.321 [2024-04-26 20:04:47.547978] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:03.321 #31 NEW cov: 10846 ft: 16669 corp: 8/57b lim: 8 exec/s: 31 rss: 72Mb L: 8/8 MS: 1 ChangeBit- 00:09:03.321 [2024-04-26 20:04:47.730749] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:03.579 #32 NEW cov: 10853 ft: 16773 corp: 9/65b lim: 8 exec/s: 32 rss: 72Mb L: 8/8 MS: 1 ShuffleBytes- 00:09:03.579 [2024-04-26 20:04:47.915346] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:03.579 #38 NEW cov: 10853 ft: 17674 corp: 10/73b lim: 8 exec/s: 38 rss: 72Mb L: 8/8 MS: 1 ChangeByte- 00:09:03.837 [2024-04-26 20:04:48.096916] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:09:03.837 #39 NEW cov: 10853 ft: 17714 corp: 11/81b lim: 8 exec/s: 19 rss: 72Mb L: 8/8 MS: 1 ChangeASCIIInt- 00:09:03.837 #39 DONE cov: 10853 ft: 17714 corp: 11/81b lim: 8 exec/s: 19 rss: 72Mb 00:09:03.837 Done 39 runs in 2 second(s) 00:09:03.837 [2024-04-26 20:04:48.225094] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-2/domain/2: disabling controller 00:09:04.096 20:04:48 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-2 /var/tmp/suppress_vfio_fuzz 00:09:04.096 20:04:48 -- ../common.sh@72 -- # (( i++ )) 00:09:04.096 20:04:48 -- 
../common.sh@72 -- # (( i < fuzz_num )) 00:09:04.096 20:04:48 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:09:04.096 20:04:48 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:09:04.096 20:04:48 -- vfio/run.sh@23 -- # local timen=1 00:09:04.096 20:04:48 -- vfio/run.sh@24 -- # local core=0x1 00:09:04.096 20:04:48 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:04.096 20:04:48 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-3 00:09:04.096 20:04:48 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:09:04.096 20:04:48 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:09:04.096 20:04:48 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:09:04.096 20:04:48 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:04.096 20:04:48 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:04.096 20:04:48 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:04.096 20:04:48 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:09:04.096 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:04.096 20:04:48 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:04.096 20:04:48 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:04.096 20:04:48 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:09:04.355 [2024-04-26 20:04:48.545732] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:09:04.355 [2024-04-26 20:04:48.545810] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631057 ] 00:09:04.355 EAL: No free 2048 kB hugepages reported on node 1 00:09:04.355 [2024-04-26 20:04:48.638421] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:04.355 [2024-04-26 20:04:48.721548] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.613 INFO: Running with entropic power schedule (0xFF, 100). 00:09:04.613 INFO: Seed: 2335192278 00:09:04.613 INFO: Loaded 1 modules (345776 inline 8-bit counters): 345776 [0x28775cc, 0x28cbc7c), 00:09:04.613 INFO: Loaded 1 PC tables (345776 PCs): 345776 [0x28cbc80,0x2e12780), 00:09:04.613 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:09:04.613 INFO: A corpus is not provided, starting from an empty corpus 00:09:04.613 #2 INITED exec/s: 0 rss: 63Mb 00:09:04.613 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:04.613 This may also happen if the target rejected all inputs we tried so far 00:09:04.613 [2024-04-26 20:04:48.971896] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: enabling controller 00:09:04.613 [2024-04-26 20:04:49.044031] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x787878780000, 0x787878780000) fd=323 offset=0x2900000000000000 prot=0x3: Invalid argument 00:09:04.613 [2024-04-26 20:04:49.044073] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x787878780000, 0x787878780000) offset=0x2900000000000000 flags=0x3: Invalid argument 00:09:04.613 [2024-04-26 20:04:49.044084] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:04.613 [2024-04-26 20:04:49.044105] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:05.130 NEW_FUNC[1/637]: 0x482d80 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:124 00:09:05.130 NEW_FUNC[2/637]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:05.130 #86 NEW cov: 10827 ft: 10749 corp: 2/33b lim: 32 exec/s: 0 rss: 70Mb L: 32/32 MS: 4 InsertByte-ChangeByte-InsertRepeatedBytes-InsertRepeatedBytes- 00:09:05.130 [2024-04-26 20:04:49.543391] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x690000000a000000, 0x690000000a696969) fd=325 offset=0xa00000000000000 prot=0x3: Permission denied 00:09:05.130 [2024-04-26 20:04:49.543431] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x690000000a000000, 0x690000000a696969) offset=0xa00000000000000 flags=0x3: Permission denied 00:09:05.130 [2024-04-26 20:04:49.543443] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Permission denied 00:09:05.130 [2024-04-26 20:04:49.543460] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:05.389 #95 NEW cov: 10841 ft: 13878 corp: 3/65b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 4 InsertRepeatedBytes-CopyPart-CopyPart-InsertRepeatedBytes- 00:09:05.389 [2024-04-26 20:04:49.738014] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x69007e000a000000, 0x69007e000a696969) fd=325 offset=0xa00000000000000 prot=0x3: Permission denied 00:09:05.389 [2024-04-26 20:04:49.738042] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x69007e000a000000, 0x69007e000a696969) offset=0xa00000000000000 flags=0x3: Permission denied 00:09:05.389 [2024-04-26 20:04:49.738053] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Permission denied 00:09:05.389 [2024-04-26 20:04:49.738087] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:05.648 NEW_FUNC[1/1]: 0x19904a0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:05.648 #101 NEW cov: 10861 ft: 15329 corp: 4/97b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 ChangeByte- 00:09:05.648 [2024-04-26 20:04:49.921057] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x69007e000a000000, 0x690082000a696969) fd=325 offset=0xa00000000000000 prot=0x3: Permission denied 00:09:05.648 [2024-04-26 20:04:49.921082] 
vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x69007e000a000000, 0x690082000a696969) offset=0xa00000000000000 flags=0x3: Permission denied 00:09:05.648 [2024-04-26 20:04:49.921093] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Permission denied 00:09:05.648 [2024-04-26 20:04:49.921113] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:05.648 #102 NEW cov: 10861 ft: 15869 corp: 5/129b lim: 32 exec/s: 102 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:05.906 [2024-04-26 20:04:50.107726] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x787878780000, 0x787878780000) fd=325 offset=0x2900000000000000 prot=0x3: Invalid argument 00:09:05.906 [2024-04-26 20:04:50.107763] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x787878780000, 0x787878780000) offset=0x2900000000000000 flags=0x3: Invalid argument 00:09:05.906 [2024-04-26 20:04:50.107775] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:05.906 [2024-04-26 20:04:50.107793] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:05.906 #113 NEW cov: 10861 ft: 16273 corp: 6/161b lim: 32 exec/s: 113 rss: 74Mb L: 32/32 MS: 1 ChangeBit- 00:09:05.906 [2024-04-26 20:04:50.291627] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x690000000a000000, 0x690000000a696969) fd=325 offset=0xa00000000000000 prot=0x3: Permission denied 00:09:05.906 [2024-04-26 20:04:50.291659] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x690000000a000000, 0x690000000a696969) offset=0xa00000000000000 flags=0x3: Permission denied 00:09:05.906 [2024-04-26 20:04:50.291671] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Permission denied 00:09:05.906 [2024-04-26 20:04:50.291689] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:06.164 #124 NEW cov: 10861 ft: 16418 corp: 7/193b lim: 32 exec/s: 124 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:06.164 [2024-04-26 20:04:50.473638] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x96ff81fff3000000, 0x96ff85fff3969696) fd=325 offset=0xa00000000000000 prot=0x3: Permission denied 00:09:06.164 [2024-04-26 20:04:50.473667] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x96ff81fff3000000, 0x96ff85fff3969696) offset=0xa00000000000000 flags=0x3: Permission denied 00:09:06.164 [2024-04-26 20:04:50.473678] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Permission denied 00:09:06.164 [2024-04-26 20:04:50.473694] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:06.164 #135 NEW cov: 10861 ft: 16828 corp: 8/225b lim: 32 exec/s: 135 rss: 74Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:06.423 [2024-04-26 20:04:50.654644] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x787878780000, 0x787878780000) fd=325 offset=0x2900000000000000 prot=0x3: Invalid argument 00:09:06.423 [2024-04-26 20:04:50.654669] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x787878780000, 0x787878780000) offset=0x2900000000000000 flags=0x3: Invalid argument 00:09:06.423 
[2024-04-26 20:04:50.654680] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:06.423 [2024-04-26 20:04:50.654696] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:06.423 #136 NEW cov: 10868 ft: 17008 corp: 9/257b lim: 32 exec/s: 136 rss: 74Mb L: 32/32 MS: 1 ChangeByte- 00:09:06.423 [2024-04-26 20:04:50.838762] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to memory map DMA region [0x787878780000, 0x787878780000) fd=325 offset=0x2900000000000000 prot=0x3: Invalid argument 00:09:06.423 [2024-04-26 20:04:50.838786] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x787878780000, 0x787878780000) offset=0x2900000000000000 flags=0x3: Invalid argument 00:09:06.423 [2024-04-26 20:04:50.838797] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: Invalid argument 00:09:06.423 [2024-04-26 20:04:50.838813] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:06.681 #142 NEW cov: 10868 ft: 17040 corp: 10/289b lim: 32 exec/s: 142 rss: 74Mb L: 32/32 MS: 1 ShuffleBytes- 00:09:06.681 [2024-04-26 20:04:51.018643] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: DMA region size 7595717697371900416 > max 8796093022208 00:09:06.681 [2024-04-26 20:04:51.018666] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: failed to add DMA region [0x7e000a000000, 0x6969e70088000a00) offset=0xa00000004000069 flags=0x3: No space left on device 00:09:06.681 [2024-04-26 20:04:51.018677] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-3/domain/1: msg0: cmd 2 failed: No space left on device 00:09:06.681 [2024-04-26 20:04:51.018697] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 2 return failure 00:09:06.940 #143 NEW cov: 10868 ft: 17140 corp: 11/321b lim: 32 exec/s: 71 rss: 74Mb L: 32/32 MS: 1 CopyPart- 00:09:06.940 #143 DONE cov: 10868 ft: 17140 corp: 11/321b lim: 32 exec/s: 71 rss: 74Mb 00:09:06.940 Done 143 runs in 2 second(s) 00:09:06.940 [2024-04-26 20:04:51.147108] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-3/domain/2: disabling controller 00:09:07.199 20:04:51 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-3 /var/tmp/suppress_vfio_fuzz 00:09:07.199 20:04:51 -- ../common.sh@72 -- # (( i++ )) 00:09:07.199 20:04:51 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:07.199 20:04:51 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:09:07.199 20:04:51 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:09:07.199 20:04:51 -- vfio/run.sh@23 -- # local timen=1 00:09:07.199 20:04:51 -- vfio/run.sh@24 -- # local core=0x1 00:09:07.199 20:04:51 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:07.199 20:04:51 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:09:07.199 20:04:51 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:09:07.199 20:04:51 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:09:07.199 20:04:51 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:09:07.199 20:04:51 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:07.199 20:04:51 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:07.199 20:04:51 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:07.199 20:04:51 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:09:07.199 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:07.199 20:04:51 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:07.199 20:04:51 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:07.199 20:04:51 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r /tmp/vfio-user-4/spdk4.sock -Z 4 00:09:07.199 [2024-04-26 20:04:51.471329] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:09:07.199 [2024-04-26 20:04:51.471406] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631522 ] 00:09:07.199 EAL: No free 2048 kB hugepages reported on node 1 00:09:07.199 [2024-04-26 20:04:51.557231] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.199 [2024-04-26 20:04:51.636657] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.458 INFO: Running with entropic power schedule (0xFF, 100). 00:09:07.458 INFO: Seed: 944252772 00:09:07.458 INFO: Loaded 1 modules (345776 inline 8-bit counters): 345776 [0x28775cc, 0x28cbc7c), 00:09:07.458 INFO: Loaded 1 PC tables (345776 PCs): 345776 [0x28cbc80,0x2e12780), 00:09:07.458 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:09:07.458 INFO: A corpus is not provided, starting from an empty corpus 00:09:07.458 #2 INITED exec/s: 0 rss: 64Mb 00:09:07.458 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:09:07.458 This may also happen if the target rejected all inputs we tried so far 00:09:07.458 [2024-04-26 20:04:51.878054] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: enabling controller 00:09:07.974 NEW_FUNC[1/636]: 0x483600 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:144 00:09:07.974 NEW_FUNC[2/636]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:07.974 #418 NEW cov: 10825 ft: 10412 corp: 2/33b lim: 32 exec/s: 0 rss: 70Mb L: 32/32 MS: 1 InsertRepeatedBytes- 00:09:08.231 #419 NEW cov: 10839 ft: 13614 corp: 3/65b lim: 32 exec/s: 0 rss: 71Mb L: 32/32 MS: 1 CopyPart- 00:09:08.490 NEW_FUNC[1/1]: 0x19904a0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:08.490 #420 NEW cov: 10856 ft: 15465 corp: 4/97b lim: 32 exec/s: 0 rss: 72Mb L: 32/32 MS: 1 CopyPart- 00:09:08.490 #421 NEW cov: 10856 ft: 16437 corp: 5/129b lim: 32 exec/s: 421 rss: 72Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:08.748 #427 NEW cov: 10856 ft: 16940 corp: 6/161b lim: 32 exec/s: 427 rss: 72Mb L: 32/32 MS: 1 ChangeByte- 00:09:09.012 #428 NEW cov: 10856 ft: 17241 corp: 7/193b lim: 32 exec/s: 428 rss: 72Mb L: 32/32 MS: 1 CopyPart- 00:09:09.270 #431 NEW cov: 10856 ft: 17426 corp: 8/225b lim: 32 exec/s: 431 rss: 73Mb L: 32/32 MS: 3 EraseBytes-ChangeBit-CrossOver- 00:09:09.270 #432 NEW cov: 10863 ft: 17586 corp: 9/257b lim: 32 exec/s: 432 rss: 73Mb L: 32/32 MS: 1 ChangeBinInt- 00:09:09.527 #433 NEW cov: 10863 ft: 17610 corp: 10/289b lim: 32 exec/s: 433 rss: 73Mb L: 32/32 MS: 1 CMP- DE: "\200\000\000\000\000\000\000\000"- 00:09:09.786 #434 NEW cov: 10863 ft: 17922 corp: 11/321b lim: 32 exec/s: 217 rss: 73Mb L: 32/32 MS: 1 ChangeBit- 00:09:09.786 #434 DONE cov: 10863 ft: 17922 corp: 11/321b lim: 32 exec/s: 217 rss: 73Mb 00:09:09.786 ###### Recommended dictionary. ###### 00:09:09.786 "\200\000\000\000\000\000\000\000" # Uses: 0 00:09:09.786 ###### End of recommended dictionary. 
###### 00:09:09.786 Done 434 runs in 2 second(s) 00:09:09.786 [2024-04-26 20:04:54.047116] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-4/domain/2: disabling controller 00:09:10.044 20:04:54 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-4 /var/tmp/suppress_vfio_fuzz 00:09:10.044 20:04:54 -- ../common.sh@72 -- # (( i++ )) 00:09:10.044 20:04:54 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:10.044 20:04:54 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:09:10.044 20:04:54 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:09:10.044 20:04:54 -- vfio/run.sh@23 -- # local timen=1 00:09:10.044 20:04:54 -- vfio/run.sh@24 -- # local core=0x1 00:09:10.044 20:04:54 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:10.044 20:04:54 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:09:10.044 20:04:54 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:09:10.044 20:04:54 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:09:10.044 20:04:54 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:09:10.044 20:04:54 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:10.044 20:04:54 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:10.044 20:04:54 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:10.044 20:04:54 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:09:10.044 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:10.044 20:04:54 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:10.044 20:04:54 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:10.044 20:04:54 -- vfio/run.sh@47 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:09:10.044 [2024-04-26 20:04:54.345517] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:09:10.044 [2024-04-26 20:04:54.345605] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1631882 ] 00:09:10.044 EAL: No free 2048 kB hugepages reported on node 1 00:09:10.044 [2024-04-26 20:04:54.431234] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:10.303 [2024-04-26 20:04:54.513016] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:10.303 INFO: Running with entropic power schedule (0xFF, 100). 
00:09:10.303 INFO: Seed: 3832243467 00:09:10.303 INFO: Loaded 1 modules (345776 inline 8-bit counters): 345776 [0x28775cc, 0x28cbc7c), 00:09:10.303 INFO: Loaded 1 PC tables (345776 PCs): 345776 [0x28cbc80,0x2e12780), 00:09:10.303 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:09:10.303 INFO: A corpus is not provided, starting from an empty corpus 00:09:10.303 #2 INITED exec/s: 0 rss: 63Mb 00:09:10.303 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:10.303 This may also happen if the target rejected all inputs we tried so far 00:09:10.561 [2024-04-26 20:04:54.764945] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: enabling controller 00:09:10.561 [2024-04-26 20:04:54.812903] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:10.561 [2024-04-26 20:04:54.812939] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:10.819 NEW_FUNC[1/637]: 0x484000 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:171 00:09:10.819 NEW_FUNC[2/637]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:10.819 #33 NEW cov: 10816 ft: 10779 corp: 2/14b lim: 13 exec/s: 0 rss: 70Mb L: 13/13 MS: 1 InsertRepeatedBytes- 00:09:11.077 [2024-04-26 20:04:55.290909] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.077 [2024-04-26 20:04:55.290952] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.077 #37 NEW cov: 10845 ft: 13710 corp: 3/27b lim: 13 exec/s: 0 rss: 71Mb L: 13/13 MS: 4 ChangeBit-CopyPart-CrossOver-InsertRepeatedBytes- 00:09:11.077 [2024-04-26 20:04:55.472436] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.077 [2024-04-26 20:04:55.472472] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.336 NEW_FUNC[1/1]: 0x19904a0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:11.336 #54 NEW cov: 10865 ft: 15387 corp: 4/40b lim: 13 exec/s: 0 rss: 72Mb L: 13/13 MS: 2 EraseBytes-CopyPart- 00:09:11.336 [2024-04-26 20:04:55.644665] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.336 [2024-04-26 20:04:55.644700] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.336 #65 NEW cov: 10865 ft: 16137 corp: 5/53b lim: 13 exec/s: 65 rss: 72Mb L: 13/13 MS: 1 CrossOver- 00:09:11.615 [2024-04-26 20:04:55.816227] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.615 [2024-04-26 20:04:55.816259] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.615 #66 NEW cov: 10865 ft: 16185 corp: 6/66b lim: 13 exec/s: 66 rss: 72Mb L: 13/13 MS: 1 ChangeByte- 00:09:11.615 [2024-04-26 20:04:55.986393] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.615 [2024-04-26 20:04:55.986423] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.873 #72 NEW cov: 10865 ft: 16320 corp: 7/79b lim: 13 exec/s: 72 rss: 72Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:11.873 [2024-04-26 20:04:56.157759] vfio_user.c:3106:vfio_user_log: *ERROR*: 
/tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:11.873 [2024-04-26 20:04:56.157789] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:11.873 #74 NEW cov: 10865 ft: 16856 corp: 8/92b lim: 13 exec/s: 74 rss: 72Mb L: 13/13 MS: 2 EraseBytes-CrossOver- 00:09:12.130 [2024-04-26 20:04:56.327778] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.130 [2024-04-26 20:04:56.327808] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.130 #75 NEW cov: 10865 ft: 16970 corp: 9/105b lim: 13 exec/s: 75 rss: 72Mb L: 13/13 MS: 1 ChangeBit- 00:09:12.130 [2024-04-26 20:04:56.500214] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.131 [2024-04-26 20:04:56.500244] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.388 #81 NEW cov: 10872 ft: 17158 corp: 10/118b lim: 13 exec/s: 81 rss: 73Mb L: 13/13 MS: 1 ChangeBit- 00:09:12.388 [2024-04-26 20:04:56.671769] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:12.388 [2024-04-26 20:04:56.671800] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:12.388 #82 NEW cov: 10872 ft: 17441 corp: 11/131b lim: 13 exec/s: 41 rss: 73Mb L: 13/13 MS: 1 ChangeBinInt- 00:09:12.388 #82 DONE cov: 10872 ft: 17441 corp: 11/131b lim: 13 exec/s: 41 rss: 73Mb 00:09:12.388 Done 82 runs in 2 second(s) 00:09:12.388 [2024-04-26 20:04:56.797098] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-5/domain/2: disabling controller 00:09:12.647 20:04:57 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-5 /var/tmp/suppress_vfio_fuzz 00:09:12.647 20:04:57 -- ../common.sh@72 -- # (( i++ )) 00:09:12.647 20:04:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:12.647 20:04:57 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:09:12.647 20:04:57 -- vfio/run.sh@22 -- # local fuzzer_type=6 00:09:12.647 20:04:57 -- vfio/run.sh@23 -- # local timen=1 00:09:12.647 20:04:57 -- vfio/run.sh@24 -- # local core=0x1 00:09:12.647 20:04:57 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:12.647 20:04:57 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6 00:09:12.647 20:04:57 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1 00:09:12.647 20:04:57 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2 00:09:12.647 20:04:57 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf 00:09:12.647 20:04:57 -- vfio/run.sh@30 -- # local suppress_file=/var/tmp/suppress_vfio_fuzz 00:09:12.647 20:04:57 -- vfio/run.sh@34 -- # local LSAN_OPTIONS=report_objects=1:suppressions=/var/tmp/suppress_vfio_fuzz:print_suppressions=0 00:09:12.647 20:04:57 -- vfio/run.sh@36 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:12.647 20:04:57 -- vfio/run.sh@39 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%; 00:09:12.647 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:09:12.647 20:04:57 -- vfio/run.sh@43 -- # echo leak:spdk_nvmf_qpair_disconnect 00:09:12.647 20:04:57 -- vfio/run.sh@44 -- # echo leak:nvmf_ctrlr_create 00:09:12.647 20:04:57 -- vfio/run.sh@47 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6 00:09:12.906 [2024-04-26 20:04:57.092931] Starting SPDK v24.05-pre git sha1 13a9f2aa2 / DPDK 23.11.0 initialization... 00:09:12.906 [2024-04-26 20:04:57.093006] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1632240 ] 00:09:12.906 EAL: No free 2048 kB hugepages reported on node 1 00:09:12.906 [2024-04-26 20:04:57.177923] app.c: 828:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:12.906 [2024-04-26 20:04:57.257656] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:09:13.163 INFO: Running with entropic power schedule (0xFF, 100). 00:09:13.163 INFO: Seed: 2279272244 00:09:13.163 INFO: Loaded 1 modules (345776 inline 8-bit counters): 345776 [0x28775cc, 0x28cbc7c), 00:09:13.163 INFO: Loaded 1 PC tables (345776 PCs): 345776 [0x28cbc80,0x2e12780), 00:09:13.163 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 00:09:13.163 INFO: A corpus is not provided, starting from an empty corpus 00:09:13.163 #2 INITED exec/s: 0 rss: 63Mb 00:09:13.163 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:09:13.163 This may also happen if the target rejected all inputs we tried so far 00:09:13.163 [2024-04-26 20:04:57.508765] vfio_user.c:2836:enable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: enabling controller 00:09:13.163 [2024-04-26 20:04:57.555975] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.163 [2024-04-26 20:04:57.556011] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.677 NEW_FUNC[1/637]: 0x484cf0 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190 00:09:13.677 NEW_FUNC[2/637]: 0x487230 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:09:13.677 #65 NEW cov: 10826 ft: 10597 corp: 2/10b lim: 9 exec/s: 0 rss: 70Mb L: 9/9 MS: 3 CrossOver-CrossOver-InsertRepeatedBytes- 00:09:13.677 [2024-04-26 20:04:58.043853] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.677 [2024-04-26 20:04:58.043911] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.935 #66 NEW cov: 10840 ft: 13567 corp: 3/19b lim: 9 exec/s: 0 rss: 71Mb L: 9/9 MS: 1 CopyPart- 00:09:13.935 [2024-04-26 20:04:58.215500] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:13.935 [2024-04-26 20:04:58.215538] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:13.935 NEW_FUNC[1/1]: 0x19904a0 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:09:13.935 #72 NEW cov: 10857 ft: 15495 corp: 4/28b lim: 9 exec/s: 0 rss: 72Mb L: 9/9 MS: 1 CrossOver- 00:09:14.192 [2024-04-26 20:04:58.396595] vfio_user.c:3106:vfio_user_log: *ERROR*: 
/tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.192 [2024-04-26 20:04:58.396626] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.192 #73 NEW cov: 10857 ft: 16186 corp: 5/37b lim: 9 exec/s: 73 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:09:14.192 [2024-04-26 20:04:58.565801] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.192 [2024-04-26 20:04:58.565830] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.449 #74 NEW cov: 10857 ft: 16297 corp: 6/46b lim: 9 exec/s: 74 rss: 72Mb L: 9/9 MS: 1 ChangeByte- 00:09:14.449 [2024-04-26 20:04:58.734166] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.449 [2024-04-26 20:04:58.734196] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.449 #75 NEW cov: 10857 ft: 16636 corp: 7/55b lim: 9 exec/s: 75 rss: 72Mb L: 9/9 MS: 1 CopyPart- 00:09:14.708 [2024-04-26 20:04:58.912466] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.708 [2024-04-26 20:04:58.912497] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.708 #76 NEW cov: 10857 ft: 16675 corp: 8/64b lim: 9 exec/s: 76 rss: 72Mb L: 9/9 MS: 1 ChangeBit- 00:09:14.708 [2024-04-26 20:04:59.081165] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.708 [2024-04-26 20:04:59.081194] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.966 #82 NEW cov: 10857 ft: 16691 corp: 9/73b lim: 9 exec/s: 82 rss: 73Mb L: 9/9 MS: 1 ChangeBinInt- 00:09:14.966 [2024-04-26 20:04:59.251379] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:14.966 [2024-04-26 20:04:59.251410] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:14.966 #84 NEW cov: 10864 ft: 16894 corp: 10/82b lim: 9 exec/s: 84 rss: 73Mb L: 9/9 MS: 2 EraseBytes-InsertByte- 00:09:15.224 [2024-04-26 20:04:59.422661] vfio_user.c:3106:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument 00:09:15.224 [2024-04-26 20:04:59.422694] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:09:15.224 #86 NEW cov: 10864 ft: 16929 corp: 11/91b lim: 9 exec/s: 43 rss: 73Mb L: 9/9 MS: 2 EraseBytes-CrossOver- 00:09:15.224 #86 DONE cov: 10864 ft: 16929 corp: 11/91b lim: 9 exec/s: 43 rss: 73Mb 00:09:15.224 Done 86 runs in 2 second(s) 00:09:15.224 [2024-04-26 20:04:59.542077] vfio_user.c:2798:disable_ctrlr: *NOTICE*: /tmp/vfio-user-6/domain/2: disabling controller 00:09:15.483 20:04:59 -- vfio/run.sh@58 -- # rm -rf /tmp/vfio-user-6 /var/tmp/suppress_vfio_fuzz 00:09:15.483 20:04:59 -- ../common.sh@72 -- # (( i++ )) 00:09:15.483 20:04:59 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:09:15.483 20:04:59 -- vfio/run.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:09:15.483 00:09:15.483 real 0m20.104s 00:09:15.483 user 0m27.608s 00:09:15.483 sys 0m2.058s 00:09:15.483 20:04:59 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:15.483 20:04:59 -- common/autotest_common.sh@10 -- # set +x 00:09:15.483 ************************************ 00:09:15.483 END TEST vfio_fuzz 00:09:15.483 ************************************ 00:09:15.483 20:04:59 -- fuzz/llvm.sh@67 -- # [[ 1 -eq 0 ]] 00:09:15.483 00:09:15.483 real 1m26.104s 00:09:15.483 user 2m8.317s 00:09:15.483 sys 0m10.573s 
00:09:15.483 20:04:59 -- common/autotest_common.sh@1122 -- # xtrace_disable 00:09:15.483 20:04:59 -- common/autotest_common.sh@10 -- # set +x 00:09:15.483 ************************************ 00:09:15.483 END TEST llvm_fuzz 00:09:15.483 ************************************ 00:09:15.483 20:04:59 -- spdk/autotest.sh@373 -- # [[ 0 -eq 1 ]] 00:09:15.483 20:04:59 -- spdk/autotest.sh@378 -- # trap - SIGINT SIGTERM EXIT 00:09:15.483 20:04:59 -- spdk/autotest.sh@380 -- # timing_enter post_cleanup 00:09:15.483 20:04:59 -- common/autotest_common.sh@720 -- # xtrace_disable 00:09:15.483 20:04:59 -- common/autotest_common.sh@10 -- # set +x 00:09:15.483 20:04:59 -- spdk/autotest.sh@381 -- # autotest_cleanup 00:09:15.483 20:04:59 -- common/autotest_common.sh@1388 -- # local autotest_es=0 00:09:15.483 20:04:59 -- common/autotest_common.sh@1389 -- # xtrace_disable 00:09:15.483 20:04:59 -- common/autotest_common.sh@10 -- # set +x 00:09:20.823 INFO: APP EXITING 00:09:20.823 INFO: killing all VMs 00:09:20.823 INFO: killing vhost app 00:09:20.823 INFO: EXIT DONE 00:09:24.118 Waiting for block devices as requested 00:09:24.118 0000:1a:00.0 (8086 0a54): vfio-pci -> nvme 00:09:24.118 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:24.118 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:24.118 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:24.118 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:24.377 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:24.377 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:24.377 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:24.377 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:24.635 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:09:24.635 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:09:24.635 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:09:24.894 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:09:24.894 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:09:24.894 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:09:25.154 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:09:25.154 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:09:31.723 Cleaning 00:09:31.723 Removing: /dev/shm/spdk_tgt_trace.pid1602483 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1599957 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1601160 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1602483 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1603138 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1603963 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1604145 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1604922 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1605100 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1605446 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1605678 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1605957 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1606342 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1606595 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1606805 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1607003 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1607278 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1608116 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1610530 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1610914 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1611129 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1611301 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1611707 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1611872 00:09:31.723 Removing: /var/run/dpdk/spdk_pid1612281 00:09:31.724 Removing: /var/run/dpdk/spdk_pid1612457 00:09:31.724 Removing: 
00:09:31.723 Cleaning
00:09:31.723 Removing: /dev/shm/spdk_tgt_trace.pid1602483
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1599957
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1601160
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1602483
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1603138
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1603963
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1604145
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1604922
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1605100
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1605446
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1605678
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1605957
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1606342
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1606595
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1606805
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1607003
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1607278
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1608116
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1610530
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1610914
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1611129
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1611301
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1611707
00:09:31.723 Removing: /var/run/dpdk/spdk_pid1611872
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1612281
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1612457
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1612668
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1612846
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1613056
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1613076
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1613540
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1613742
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1613943
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1614182
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1614417
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1614612
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1614828
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1615064
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1615284
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1615584
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1615805
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1616347
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1616766
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1616966
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1617177
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1617382
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1617642
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1617951
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1618153
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1618358
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1618560
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1618772
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1619063
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1619336
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1619551
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1619749
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1619959
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1620190
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1620461
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1621045
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1621400
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1621756
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1622115
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1622469
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1622831
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1623184
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1623546
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1623908
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1624262
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1624621
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1624987
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1625340
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1625702
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1626055
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1626417
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1626734
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1627055
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1627352
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1627688
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1628043
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1628403
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1628757
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1629114
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1629471
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1629918
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1630274
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1630635
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1631057
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1631522
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1631882
00:09:31.724 Removing: /var/run/dpdk/spdk_pid1632240
00:09:31.724 Clean
00:09:31.724 20:05:15 -- common/autotest_common.sh@1447 -- # return 0
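The Cleaning/Removing block above deletes the state an SPDK target leaves behind on the host: the shared-memory trace file written by the spdk_tgt process and one DPDK runtime entry per process id under /var/run/dpdk, so stale hugepage and IPC state does not leak into the next job. An equivalent manual sweep looks roughly like the sketch below; the wildcards assume that everything matching these prefixes belongs to finished SPDK processes.

    # Illustrative sketch only, not what the job executed.
    sudo rm -f  /dev/shm/spdk_tgt_trace.pid*   # trace shm left by spdk_tgt instances
    sudo rm -rf /var/run/dpdk/spdk_pid*        # per-PID DPDK runtime state (config, locks, mp socket)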
00:09:31.724 20:05:15 -- spdk/autotest.sh@382 -- # timing_exit post_cleanup
00:09:31.724 20:05:15 -- common/autotest_common.sh@726 -- # xtrace_disable
00:09:31.724 20:05:15 -- common/autotest_common.sh@10 -- # set +x
00:09:31.724 20:05:15 -- spdk/autotest.sh@384 -- # timing_exit autotest
00:09:31.724 20:05:15 -- common/autotest_common.sh@726 -- # xtrace_disable
00:09:31.724 20:05:15 -- common/autotest_common.sh@10 -- # set +x
00:09:31.724 20:05:15 -- spdk/autotest.sh@385 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:31.724 20:05:15 -- spdk/autotest.sh@387 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:31.724 20:05:15 -- spdk/autotest.sh@387 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:31.724 20:05:15 -- spdk/autotest.sh@389 -- # hash lcov
00:09:31.724 20:05:15 -- spdk/autotest.sh@389 -- # [[ CC_TYPE=clang == *\c\l\a\n\g* ]]
00:09:31.724 20:05:15 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:31.724 20:05:15 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:31.724 20:05:15 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:31.724 20:05:15 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:09:31.724 20:05:15 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:31.724 20:05:15 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:31.724 20:05:15 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:31.724 20:05:15 -- paths/export.sh@5 -- $ export PATH
00:09:31.724 20:05:15 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:31.724 20:05:15 -- common/autobuild_common.sh@434 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:31.724 20:05:15 -- common/autobuild_common.sh@435 -- $ date +%s
00:09:31.724 20:05:15 -- common/autobuild_common.sh@435 -- $ mktemp -dt spdk_1714154715.XXXXXX
00:09:31.724 20:05:15 -- common/autobuild_common.sh@435 -- $ SPDK_WORKSPACE=/tmp/spdk_1714154715.n9wZR9
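The last few trace lines above show the autobuild helper creating a throwaway workspace whose name embeds the epoch timestamp, which is why the directory comes out as /tmp/spdk_1714154715.n9wZR9. The sketch below reproduces that pattern on its own; the cleanup at the end is added for completeness and is not visible in this log.

    # Illustrative sketch of the scratch-workspace pattern above.
    ts=$(date +%s)                                # e.g. 1714154715
    workspace=$(mktemp -dt "spdk_${ts}.XXXXXX")   # unique dir under $TMPDIR (default /tmp)
    echo "using $workspace"
    # ... do work ...
    rm -rf "$workspace"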
00:09:31.724 20:05:15 -- common/autobuild_common.sh@437 -- $ [[ -n '' ]]
00:09:31.724 20:05:15 -- common/autobuild_common.sh@441 -- $ '[' -n '' ']'
00:09:31.724 20:05:15 -- common/autobuild_common.sh@444 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:09:31.724 20:05:15 -- common/autobuild_common.sh@448 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:31.724 20:05:15 -- common/autobuild_common.sh@450 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:31.724 20:05:15 -- common/autobuild_common.sh@451 -- $ get_config_params
00:09:31.724 20:05:15 -- common/autotest_common.sh@395 -- $ xtrace_disable
00:09:31.724 20:05:15 -- common/autotest_common.sh@10 -- $ set +x
00:09:31.724 20:05:15 -- common/autobuild_common.sh@451 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:09:31.724 20:05:15 -- common/autobuild_common.sh@453 -- $ start_monitor_resources
00:09:31.724 20:05:15 -- pm/common@17 -- $ local monitor
00:09:31.724 20:05:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:31.724 20:05:15 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1638410
00:09:31.724 20:05:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:31.724 20:05:15 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1638413
00:09:31.724 20:05:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:31.724 20:05:15 -- pm/common@21 -- $ date +%s
00:09:31.724 20:05:15 -- pm/common@21 -- $ date +%s
00:09:31.724 20:05:15 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1638416
00:09:31.724 20:05:15 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:31.724 20:05:15 -- pm/common@23 -- $ MONITOR_RESOURCES_PIDS["$monitor"]=1638421
00:09:31.724 20:05:15 -- pm/common@21 -- $ date +%s
00:09:31.724 20:05:15 -- pm/common@26 -- $ sleep 1
00:09:31.724 20:05:15 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1714154715
00:09:31.724 20:05:15 -- pm/common@21 -- $ date +%s
00:09:31.724 20:05:15 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1714154715
00:09:31.724 20:05:15 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1714154715
00:09:31.724 20:05:15 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1714154715
00:09:31.724 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1714154715_collect-vmstat.pm.log
00:09:31.724 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1714154715_collect-cpu-load.pm.log
00:09:31.724 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1714154715_collect-bmc-pm.bmc.pm.log
00:09:31.724 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1714154715_collect-cpu-temp.pm.log
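start_monitor_resources above launches four background collectors (CPU load, vmstat, CPU temperature and BMC power) and records their process ids so they can be signalled at teardown; each is passed the same -d output directory plus -l and -p name-prefix arguments that appear verbatim in the trace (the log does not spell out what -l and -p mean, so they are reused as shown). The sketch below gives the general shape of that pattern rather than the actual pm/common implementation; SPDK_DIR and POWER_DIR are placeholder variables.

    # Illustrative sketch only, not the pm/common source.
    declare -A MONITOR_PIDS
    for mon in collect-cpu-load collect-vmstat collect-cpu-temp collect-bmc-pm; do
        sudo -E "$SPDK_DIR/scripts/perf/pm/$mon" \
            -d "$POWER_DIR" -l -p "monitor.autopackage.sh.$(date +%s)" &
        MONITOR_PIDS[$mon]=$!     # remember the PID for the TERM sweep at exit
    done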
00:09:32.292 20:05:16 -- common/autobuild_common.sh@454 -- $ trap stop_monitor_resources EXIT
00:09:32.292 20:05:16 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:09:32.292 20:05:16 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:32.292 20:05:16 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:32.292 20:05:16 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:32.292 20:05:16 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:32.292 20:05:16 -- common/autotest_common.sh@732 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:32.292 20:05:16 -- common/autotest_common.sh@733 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:32.292 20:05:16 -- common/autotest_common.sh@735 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:32.551 20:05:16 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:32.551 20:05:16 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:09:32.551 20:05:16 -- pm/common@30 -- $ signal_monitor_resources TERM
00:09:32.551 20:05:16 -- pm/common@41 -- $ local monitor pid pids signal=TERM
00:09:32.551 20:05:16 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:32.551 20:05:16 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:09:32.551 20:05:16 -- pm/common@45 -- $ pid=1638439
00:09:32.551 20:05:16 -- pm/common@52 -- $ sudo kill -TERM 1638439
00:09:32.551 20:05:16 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:32.551 20:05:16 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:09:32.551 20:05:16 -- pm/common@45 -- $ pid=1638443
00:09:32.551 20:05:16 -- pm/common@52 -- $ sudo kill -TERM 1638443
00:09:32.551 20:05:16 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:32.551 20:05:16 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:09:32.551 20:05:16 -- pm/common@45 -- $ pid=1638446
00:09:32.551 20:05:16 -- pm/common@52 -- $ sudo kill -TERM 1638446
00:09:32.551 20:05:16 -- pm/common@43 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:09:32.551 20:05:16 -- pm/common@44 -- $ [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:09:32.551 20:05:16 -- pm/common@45 -- $ pid=1638448
00:09:32.551 20:05:16 -- pm/common@52 -- $ sudo kill -TERM 1638448
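stop_monitor_resources above walks the collectors again and, for each one whose pidfile exists under the power output directory, sends SIGTERM to the recorded process id. A minimal sketch of that shutdown, assuming POWER_DIR is a placeholder for the .../output/power directory used by the job:

    # Illustrative sketch only, not the pm/common source.
    for pidfile in "$POWER_DIR"/collect-*.pid; do
        [[ -e $pidfile ]] || continue
        sudo kill -TERM "$(<"$pidfile")"   # ask the collector to stop and flush its log
    done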
00:09:32.551 + [[ -n 1492022 ]]
00:09:32.551 + sudo kill 1492022
00:09:32.560 [Pipeline] }
00:09:32.577 [Pipeline] // stage
00:09:32.581 [Pipeline] }
00:09:32.596 [Pipeline] // timeout
00:09:32.601 [Pipeline] }
00:09:32.615 [Pipeline] // catchError
00:09:32.621 [Pipeline] }
00:09:32.637 [Pipeline] // wrap
00:09:32.643 [Pipeline] }
00:09:32.658 [Pipeline] // catchError
00:09:32.666 [Pipeline] stage
00:09:32.668 [Pipeline] { (Epilogue)
00:09:32.684 [Pipeline] catchError
00:09:32.685 [Pipeline] {
00:09:32.703 [Pipeline] echo
00:09:32.705 Cleanup processes
00:09:32.710 [Pipeline] sh
00:09:32.995 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:32.995 1549847 sudo -E /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714154293
00:09:32.995 1549871 bash /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1714154293
00:09:32.995 1638582 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/sdr.cache
00:09:32.995 1639332 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:33.009 [Pipeline] sh
00:09:33.294 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:33.294 ++ grep -v 'sudo pgrep'
00:09:33.294 ++ awk '{print $1}'
00:09:33.294 + sudo kill -9 1638582
00:09:33.307 [Pipeline] sh
00:09:33.592 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:34.541 [Pipeline] sh
00:09:34.825 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:09:34.825 Artifacts sizes are good
00:09:34.839 [Pipeline] archiveArtifacts
00:09:34.846 Archiving artifacts
00:09:34.928 [Pipeline] sh
00:09:35.213 + sudo chown -R sys_sgci /var/jenkins/workspace/short-fuzz-phy-autotest
00:09:35.228 [Pipeline] cleanWs
00:09:35.238 [WS-CLEANUP] Deleting project workspace...
00:09:35.239 [WS-CLEANUP] Deferred wipeout is used...
00:09:35.245 [WS-CLEANUP] done
00:09:35.248 [Pipeline] }
00:09:35.281 [Pipeline] // catchError
00:09:35.294 [Pipeline] sh
00:09:35.578 + logger -p user.info -t JENKINS-CI
00:09:35.587 [Pipeline] }
00:09:35.605 [Pipeline] // stage
00:09:35.610 [Pipeline] }
00:09:35.628 [Pipeline] // node
00:09:35.633 [Pipeline] End of Pipeline
00:09:35.661 Finished: SUCCESS
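The Epilogue's process sweep shown above is a reusable pattern: pgrep -af lists every process whose command line mentions the workspace, grep -v drops the pgrep invocation itself, awk keeps only the PIDs, and whatever is still alive is killed; in this run that was the ipmitool left behind by collect-bmc-pm. A hedged sketch of the same sweep, with WORKSPACE standing in for the job's checkout path:

    # Illustrative sketch only, not part of the pipeline.
    sudo pgrep -af "$WORKSPACE/spdk" \
      | grep -v 'sudo pgrep' \
      | awk '{print $1}' \
      | xargs -r sudo kill -9    # -r: do nothing if the list is empty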