00:00:00.001 Started by upstream project "autotest-nightly-lts" build number 2357 00:00:00.001 originally caused by: 00:00:00.001 Started by upstream project "nightly-trigger" build number 3618 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.014 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.016 The recommended git tool is: git 00:00:00.016 using credential 00000000-0000-0000-0000-000000000002 00:00:00.023 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/short-fuzz-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.041 Fetching changes from the remote Git repository 00:00:00.043 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.061 Using shallow fetch with depth 1 00:00:00.061 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.061 > git --version # timeout=10 00:00:00.082 > git --version # 'git version 2.39.2' 00:00:00.082 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.118 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.118 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:02.234 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:02.246 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:02.257 Checking out Revision b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf (FETCH_HEAD) 00:00:02.257 > git config core.sparsecheckout # timeout=10 00:00:02.267 > git read-tree -mu HEAD # timeout=10 00:00:02.290 > git checkout -f b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=5 00:00:02.308 Commit message: "jenkins/jjb-config: Ignore OS version mismatch under freebsd" 00:00:02.308 > git rev-list --no-walk b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf # timeout=10 00:00:02.427 [Pipeline] Start of Pipeline 00:00:02.442 [Pipeline] library 00:00:02.444 Loading library shm_lib@master 00:00:02.444 Library shm_lib@master is cached. Copying from home. 00:00:02.462 [Pipeline] node 00:00:02.478 Running on WFP20 in /var/jenkins/workspace/short-fuzz-phy-autotest 00:00:02.480 [Pipeline] { 00:00:02.490 [Pipeline] catchError 00:00:02.492 [Pipeline] { 00:00:02.505 [Pipeline] wrap 00:00:02.513 [Pipeline] { 00:00:02.522 [Pipeline] stage 00:00:02.523 [Pipeline] { (Prologue) 00:00:02.702 [Pipeline] sh 00:00:02.983 + logger -p user.info -t JENKINS-CI 00:00:02.997 [Pipeline] echo 00:00:02.998 Node: WFP20 00:00:03.003 [Pipeline] sh 00:00:03.299 [Pipeline] setCustomBuildProperty 00:00:03.309 [Pipeline] echo 00:00:03.310 Cleanup processes 00:00:03.314 [Pipeline] sh 00:00:03.592 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:03.593 3559534 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:03.604 [Pipeline] sh 00:00:03.888 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:00:03.888 ++ grep -v 'sudo pgrep' 00:00:03.888 ++ awk '{print $1}' 00:00:03.888 + sudo kill -9 00:00:03.888 + true 00:00:03.900 [Pipeline] cleanWs 00:00:03.909 [WS-CLEANUP] Deleting project workspace... 00:00:03.909 [WS-CLEANUP] Deferred wipeout is used... 
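Note: the "Cleanup processes" step above gathers the PIDs of stale test processes with a pgrep/grep/awk pipeline and force-kills them; the bare '+ sudo kill -9' followed by '+ true' shows the match was empty and the failure was deliberately swallowed. A minimal standalone sketch of the same pattern (workspace path taken from this job; 'ws' and 'pids' are illustrative names):

  ws=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  # List leftover processes referencing the workspace, drop the pgrep
  # invocation itself, and keep only the PID column.
  pids=$(sudo pgrep -af "$ws" | grep -v 'sudo pgrep' | awk '{print $1}')
  # kill(1) errors out when handed no PIDs, so '|| true' keeps the step green.
  sudo kill -9 $pids || true   # $pids unquoted on purpose: one argument per PID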
00:00:03.915 [WS-CLEANUP] done 00:00:03.918 [Pipeline] setCustomBuildProperty 00:00:03.929 [Pipeline] sh 00:00:04.207 + sudo git config --global --replace-all safe.directory '*' 00:00:04.305 [Pipeline] httpRequest 00:00:04.952 [Pipeline] echo 00:00:04.954 Sorcerer 10.211.164.101 is alive 00:00:04.960 [Pipeline] retry 00:00:04.961 [Pipeline] { 00:00:04.971 [Pipeline] httpRequest 00:00:04.974 HttpMethod: GET 00:00:04.975 URL: http://10.211.164.101/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:04.975 Sending request to url: http://10.211.164.101/packages/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:04.976 Response Code: HTTP/1.1 200 OK 00:00:04.977 Success: Status code 200 is in the accepted range: 200,404 00:00:04.977 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:05.177 [Pipeline] } 00:00:05.193 [Pipeline] // retry 00:00:05.199 [Pipeline] sh 00:00:05.476 + tar --no-same-owner -xf jbp_b9dd3f7ec12b0ee8a44940dc99ce739345caa4cf.tar.gz 00:00:05.490 [Pipeline] httpRequest 00:00:05.802 [Pipeline] echo 00:00:05.803 Sorcerer 10.211.164.101 is alive 00:00:05.810 [Pipeline] retry 00:00:05.811 [Pipeline] { 00:00:05.822 [Pipeline] httpRequest 00:00:05.826 HttpMethod: GET 00:00:05.826 URL: http://10.211.164.101/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:05.826 Sending request to url: http://10.211.164.101/packages/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:00:05.835 Response Code: HTTP/1.1 200 OK 00:00:05.836 Success: Status code 200 is in the accepted range: 200,404 00:00:05.836 Saving response body to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:01:12.994 [Pipeline] } 00:01:13.012 [Pipeline] // retry 00:01:13.020 [Pipeline] sh 00:01:13.307 + tar --no-same-owner -xf spdk_c13c99a5eba3bff912124706e0ae1d70defef44d.tar.gz 00:01:15.862 [Pipeline] sh 00:01:16.146 + git -C spdk log --oneline -n5 00:01:16.146 c13c99a5e test: Various fixes for Fedora40 00:01:16.146 726a04d70 test/nvmf: adjust timeout for bigger nvmes 00:01:16.146 61c96acfb dpdk: Point dpdk submodule at a latest fix from spdk-23.11 00:01:16.146 7db6dcdb8 nvme/fio_plugin: update the way ruhs descriptors are fetched 00:01:16.146 ff6f5c41e nvme/fio_plugin: trim add support for multiple ranges 00:01:16.157 [Pipeline] } 00:01:16.171 [Pipeline] // stage 00:01:16.180 [Pipeline] stage 00:01:16.182 [Pipeline] { (Prepare) 00:01:16.200 [Pipeline] writeFile 00:01:16.216 [Pipeline] sh 00:01:16.501 + logger -p user.info -t JENKINS-CI 00:01:16.513 [Pipeline] sh 00:01:16.795 + logger -p user.info -t JENKINS-CI 00:01:16.807 [Pipeline] sh 00:01:17.090 + cat autorun-spdk.conf 00:01:17.090 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:17.090 SPDK_TEST_FUZZER_SHORT=1 00:01:17.090 SPDK_TEST_FUZZER=1 00:01:17.090 SPDK_RUN_UBSAN=1 00:01:17.098 RUN_NIGHTLY=1 00:01:17.102 [Pipeline] readFile 00:01:17.126 [Pipeline] withEnv 00:01:17.128 [Pipeline] { 00:01:17.140 [Pipeline] sh 00:01:17.425 + set -ex 00:01:17.425 + [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf ]] 00:01:17.425 + source /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:17.425 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:17.425 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:17.425 ++ SPDK_TEST_FUZZER=1 00:01:17.425 ++ SPDK_RUN_UBSAN=1 00:01:17.425 ++ RUN_NIGHTLY=1 00:01:17.425 + case $SPDK_TEST_NVMF_NICS in 00:01:17.425 + DRIVERS= 00:01:17.425 + [[ -n '' ]] 00:01:17.425 + exit 0 00:01:17.434 
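Note: the Prepare stage above writes autorun-spdk.conf as a flat KEY=value file and then re-sources it under 'set -ex', so a missing or malformed file aborts the stage immediately; with SPDK_TEST_NVMF_NICS unset, no NIC drivers are selected and the script exits 0. A simplified sketch of that gate (an assumption: the real script presumably selects DRIVERS per NIC type inside the case):

  set -ex
  conf=/var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf
  [[ -f $conf ]]                 # abort if the Prepare stage never wrote the file
  source "$conf"
  case "${SPDK_TEST_NVMF_NICS:-}" in
    "") DRIVERS= ;;              # no NICs requested on this run
  esac
  [[ -n $DRIVERS ]] || exit 0    # matches "+ [[ -n '' ]]" / "+ exit 0" above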
[Pipeline] } 00:01:17.450 [Pipeline] // withEnv 00:01:17.455 [Pipeline] } 00:01:17.469 [Pipeline] // stage 00:01:17.478 [Pipeline] catchError 00:01:17.480 [Pipeline] { 00:01:17.493 [Pipeline] timeout 00:01:17.493 Timeout set to expire in 30 min 00:01:17.495 [Pipeline] { 00:01:17.509 [Pipeline] stage 00:01:17.511 [Pipeline] { (Tests) 00:01:17.525 [Pipeline] sh 00:01:17.810 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:17.811 ++ readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:17.811 + DIR_ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest 00:01:17.811 + [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest ]] 00:01:17.811 + DIR_SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:17.811 + DIR_OUTPUT=/var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:17.811 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk ]] 00:01:17.811 + [[ ! -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:17.811 + mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/output 00:01:17.811 + [[ -d /var/jenkins/workspace/short-fuzz-phy-autotest/output ]] 00:01:17.811 + [[ short-fuzz-phy-autotest == pkgdep-* ]] 00:01:17.811 + cd /var/jenkins/workspace/short-fuzz-phy-autotest 00:01:17.811 + source /etc/os-release 00:01:17.811 ++ NAME='Fedora Linux' 00:01:17.811 ++ VERSION='39 (Cloud Edition)' 00:01:17.811 ++ ID=fedora 00:01:17.811 ++ VERSION_ID=39 00:01:17.811 ++ VERSION_CODENAME= 00:01:17.811 ++ PLATFORM_ID=platform:f39 00:01:17.811 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:01:17.811 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:17.811 ++ LOGO=fedora-logo-icon 00:01:17.811 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:01:17.811 ++ HOME_URL=https://fedoraproject.org/ 00:01:17.811 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:01:17.811 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:17.811 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:17.811 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:17.811 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:01:17.811 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:17.811 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:01:17.811 ++ SUPPORT_END=2024-11-12 00:01:17.811 ++ VARIANT='Cloud Edition' 00:01:17.811 ++ VARIANT_ID=cloud 00:01:17.811 + uname -a 00:01:17.811 Linux spdk-wfp-20 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:01:17.811 + sudo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:01:20.341 Hugepages 00:01:20.341 node hugesize free / total 00:01:20.341 node0 1048576kB 0 / 0 00:01:20.341 node0 2048kB 0 / 0 00:01:20.341 node1 1048576kB 0 / 0 00:01:20.341 node1 2048kB 0 / 0 00:01:20.341 00:01:20.341 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:20.341 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:20.341 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:20.341 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:20.341 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:20.341 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:20.341 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:20.341 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:20.341 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:20.341 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:20.341 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:20.341 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:20.341 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:20.341 I/OAT 0000:80:04.4 8086 
2021 1 ioatdma - - 00:01:20.341 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:20.341 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:20.341 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:20.341 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:01:20.341 + rm -f /tmp/spdk-ld-path 00:01:20.341 + source autorun-spdk.conf 00:01:20.341 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:20.341 ++ SPDK_TEST_FUZZER_SHORT=1 00:01:20.341 ++ SPDK_TEST_FUZZER=1 00:01:20.342 ++ SPDK_RUN_UBSAN=1 00:01:20.342 ++ RUN_NIGHTLY=1 00:01:20.342 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:20.342 + [[ -n '' ]] 00:01:20.342 + sudo git config --global --add safe.directory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:20.342 + for M in /var/spdk/build-*-manifest.txt 00:01:20.342 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:01:20.342 + cp /var/spdk/build-kernel-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:20.342 + for M in /var/spdk/build-*-manifest.txt 00:01:20.342 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:01:20.342 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:20.342 + for M in /var/spdk/build-*-manifest.txt 00:01:20.342 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:20.342 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/short-fuzz-phy-autotest/output/ 00:01:20.342 ++ uname 00:01:20.342 + [[ Linux == \L\i\n\u\x ]] 00:01:20.342 + sudo dmesg -T 00:01:20.342 + sudo dmesg --clear 00:01:20.342 + dmesg_pid=3560424 00:01:20.342 + [[ Fedora Linux == FreeBSD ]] 00:01:20.342 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:20.342 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:20.342 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:20.342 + [[ -x /usr/src/fio-static/fio ]] 00:01:20.342 + sudo dmesg -Tw 00:01:20.342 + export FIO_BIN=/usr/src/fio-static/fio 00:01:20.342 + FIO_BIN=/usr/src/fio-static/fio 00:01:20.342 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\s\h\o\r\t\-\f\u\z\z\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:20.342 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:20.342 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:20.342 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:20.342 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:20.342 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:20.342 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:20.342 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:20.342 + spdk/autorun.sh /var/jenkins/workspace/short-fuzz-phy-autotest/autorun-spdk.conf 00:01:20.342 Test configuration: 00:01:20.342 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:20.342 SPDK_TEST_FUZZER_SHORT=1 00:01:20.342 SPDK_TEST_FUZZER=1 00:01:20.342 SPDK_RUN_UBSAN=1 00:01:20.342 RUN_NIGHTLY=1 04:44:55 -- common/autotest_common.sh@1689 -- $ [[ n == y ]] 00:01:20.342 04:44:55 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:01:20.342 04:44:55 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:20.602 04:44:55 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:20.602 04:44:55 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:20.602 04:44:55 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:20.602 04:44:55 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:20.603 04:44:55 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:20.603 04:44:55 -- paths/export.sh@5 -- $ export PATH 00:01:20.603 04:44:55 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:20.603 04:44:55 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:01:20.603 04:44:55 -- common/autobuild_common.sh@440 -- $ date +%s 00:01:20.603 04:44:55 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1731037495.XXXXXX 00:01:20.603 04:44:55 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1731037495.uhtfbU 00:01:20.603 04:44:55 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]] 00:01:20.603 04:44:55 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']' 00:01:20.603 04:44:55 
-- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/' 00:01:20.603 04:44:55 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:20.603 04:44:55 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:20.603 04:44:55 -- common/autobuild_common.sh@456 -- $ get_config_params 00:01:20.603 04:44:55 -- common/autotest_common.sh@397 -- $ xtrace_disable 00:01:20.603 04:44:55 -- common/autotest_common.sh@10 -- $ set +x 00:01:20.603 04:44:55 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user' 00:01:20.603 04:44:55 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:20.603 04:44:55 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:20.603 04:44:55 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:01:20.603 04:44:55 -- spdk/autobuild.sh@16 -- $ date -u 00:01:20.603 Fri Nov 8 03:44:55 AM UTC 2024 00:01:20.603 04:44:55 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:20.603 LTS-67-gc13c99a5e 00:01:20.603 04:44:55 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:20.603 04:44:55 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:20.603 04:44:55 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:01:20.603 04:44:55 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:01:20.603 04:44:55 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:20.603 04:44:55 -- common/autotest_common.sh@10 -- $ set +x 00:01:20.603 ************************************ 00:01:20.603 START TEST ubsan 00:01:20.603 ************************************ 00:01:20.603 04:44:55 -- common/autotest_common.sh@1114 -- $ echo 'using ubsan' 00:01:20.603 using ubsan 00:01:20.603 00:01:20.603 real 0m0.000s 00:01:20.603 user 0m0.000s 00:01:20.603 sys 0m0.000s 00:01:20.603 04:44:55 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:20.603 04:44:55 -- common/autotest_common.sh@10 -- $ set +x 00:01:20.603 ************************************ 00:01:20.603 END TEST ubsan 00:01:20.603 ************************************ 00:01:20.603 04:44:55 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:01:20.603 04:44:55 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:20.603 04:44:55 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:20.603 04:44:55 -- spdk/autobuild.sh@51 -- $ [[ 1 -eq 1 ]] 00:01:20.603 04:44:55 -- spdk/autobuild.sh@52 -- $ llvm_precompile 00:01:20.603 04:44:55 -- common/autobuild_common.sh@428 -- $ run_test autobuild_llvm_precompile _llvm_precompile 00:01:20.603 04:44:55 -- common/autotest_common.sh@1087 -- $ '[' 2 -le 1 ']' 00:01:20.603 04:44:55 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:01:20.603 04:44:55 -- common/autotest_common.sh@10 -- $ set +x 00:01:20.603 ************************************ 00:01:20.603 START TEST autobuild_llvm_precompile 00:01:20.603 ************************************ 00:01:20.603 04:44:55 -- common/autotest_common.sh@1114 -- $ _llvm_precompile 00:01:20.603 04:44:55 -- common/autobuild_common.sh@32 -- $ clang 
--version 00:01:20.603 04:44:55 -- common/autobuild_common.sh@32 -- $ [[ clang version 17.0.6 (Fedora 17.0.6-2.fc39) 00:01:20.603 Target: x86_64-redhat-linux-gnu 00:01:20.603 Thread model: posix 00:01:20.603 InstalledDir: /usr/bin =~ version (([0-9]+).([0-9]+).([0-9]+)) ]] 00:01:20.603 04:44:55 -- common/autobuild_common.sh@33 -- $ clang_num=17 00:01:20.603 04:44:55 -- common/autobuild_common.sh@35 -- $ export CC=clang-17 00:01:20.603 04:44:55 -- common/autobuild_common.sh@35 -- $ CC=clang-17 00:01:20.603 04:44:55 -- common/autobuild_common.sh@36 -- $ export CXX=clang++-17 00:01:20.603 04:44:55 -- common/autobuild_common.sh@36 -- $ CXX=clang++-17 00:01:20.603 04:44:55 -- common/autobuild_common.sh@38 -- $ fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a) 00:01:20.603 04:44:55 -- common/autobuild_common.sh@39 -- $ fuzzer_lib=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:20.603 04:44:55 -- common/autobuild_common.sh@40 -- $ [[ -e /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a ]] 00:01:20.603 04:44:55 -- common/autobuild_common.sh@42 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a' 00:01:20.603 04:44:55 -- common/autobuild_common.sh@44 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:20.863 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:20.863 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:21.121 Using 'verbs' RDMA provider 00:01:37.016 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:01:49.218 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:01:49.218 Creating mk/config.mk...done. 00:01:49.218 Creating mk/cc.flags.mk...done. 00:01:49.218 Type 'make' to build. 
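Note: autobuild_llvm_precompile above derives the clang major version from 'clang --version' with a bash regex, exports CC/CXX, and then locates the libFuzzer runtime with an extglob pattern. A sketch of those steps as traced (regex shown here with escaped dots; extglob must be enabled for the @(...) and ?(...) forms; the resolved library path is the one this run found):

  shopt -s extglob
  if [[ $(clang --version) =~ version\ (([0-9]+)\.([0-9]+)\.([0-9]+)) ]]; then
    clang_version=${BASH_REMATCH[1]}   # 17.0.6 on this node
    clang_num=${BASH_REMATCH[2]}       # 17
  fi
  export CC=clang-$clang_num CXX=clang++-$clang_num
  # @(a|b) matches either alternative; ?(-x86_64) makes the suffix optional.
  fuzzer_libs=(/usr/lib*/clang/@("$clang_num"|"$clang_version")/lib/*linux*/libclang_rt.fuzzer_no_main?(-x86_64).a)
  fuzzer_lib=${fuzzer_libs[0]}
  [[ -e $fuzzer_lib ]]   # /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/... here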
00:01:49.218 00:01:49.218 real 0m28.651s 00:01:49.218 user 0m12.560s 00:01:49.218 sys 0m15.517s 00:01:49.218 04:45:24 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:01:49.218 04:45:24 -- common/autotest_common.sh@10 -- $ set +x 00:01:49.218 ************************************ 00:01:49.218 END TEST autobuild_llvm_precompile 00:01:49.218 ************************************ 00:01:49.218 04:45:24 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:49.218 04:45:24 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:49.218 04:45:24 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:49.218 04:45:24 -- spdk/autobuild.sh@62 -- $ [[ 1 -eq 1 ]] 00:01:49.218 04:45:24 -- spdk/autobuild.sh@64 -- $ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user --with-fuzzer=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:01:49.477 Using default SPDK env in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:01:49.477 Using default DPDK in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:01:50.044 Using 'verbs' RDMA provider 00:02:02.816 Configuring ISA-L (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l/spdk-isal.log)...done. 00:02:15.045 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/isa-l-crypto/spdk-isal-crypto.log)...done. 00:02:15.045 Creating mk/config.mk...done. 00:02:15.045 Creating mk/cc.flags.mk...done. 00:02:15.045 Type 'make' to build. 00:02:15.045 04:45:48 -- spdk/autobuild.sh@69 -- $ run_test make make -j112 00:02:15.045 04:45:48 -- common/autotest_common.sh@1087 -- $ '[' 3 -le 1 ']' 00:02:15.045 04:45:48 -- common/autotest_common.sh@1093 -- $ xtrace_disable 00:02:15.045 04:45:48 -- common/autotest_common.sh@10 -- $ set +x 00:02:15.045 ************************************ 00:02:15.045 START TEST make 00:02:15.045 ************************************ 00:02:15.045 04:45:48 -- common/autotest_common.sh@1114 -- $ make -j112 00:02:15.045 make[1]: Nothing to be done for 'all'. 
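Note: run_test, seen above wrapping both 'echo using ubsan' and 'make -j112', is SPDK's common harness that prints START/END banners around a named test and times it; the "'[' 3 -le 1 ']'" lines are its minimum-argument check running under xtrace. A minimal stand-in with the same shape (not the actual SPDK implementation):

  run_test() {
    local name=$1; shift
    (( $# >= 1 )) || return 1   # a command must follow the test name
    echo "************************************"
    echo "START TEST $name"
    time "$@"                   # emits the real/user/sys lines seen above
    local rc=$?
    echo "END TEST $name"
    echo "************************************"
    return $rc
  }
  run_test make make -j112      # as invoked by autobuild.sh in this log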
00:02:15.612 The Meson build system 00:02:15.612 Version: 1.5.0 00:02:15.612 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user 00:02:15.612 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:15.612 Build type: native build 00:02:15.612 Project name: libvfio-user 00:02:15.613 Project version: 0.0.1 00:02:15.613 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:02:15.613 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:02:15.613 Host machine cpu family: x86_64 00:02:15.613 Host machine cpu: x86_64 00:02:15.613 Run-time dependency threads found: YES 00:02:15.613 Library dl found: YES 00:02:15.613 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:15.613 Run-time dependency json-c found: YES 0.17 00:02:15.613 Run-time dependency cmocka found: YES 1.1.7 00:02:15.613 Program pytest-3 found: NO 00:02:15.613 Program flake8 found: NO 00:02:15.613 Program misspell-fixer found: NO 00:02:15.613 Program restructuredtext-lint found: NO 00:02:15.613 Program valgrind found: YES (/usr/bin/valgrind) 00:02:15.613 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:15.613 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:15.613 Compiler for C supports arguments -Wwrite-strings: YES 00:02:15.613 ../libvfio-user/test/meson.build:20: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 00:02:15.613 Program test-lspci.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-lspci.sh) 00:02:15.613 Program test-linkage.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user/test/test-linkage.sh) 00:02:15.613 ../libvfio-user/test/py/meson.build:16: WARNING: Project targets '>= 0.53.0' but uses feature introduced in '0.57.0': exclude_suites arg in add_test_setup. 
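Note: the summary above comes from an out-of-tree Meson configure of the bundled libvfio-user. The exact command is not shown in this log; a hedged reconstruction from the Source dir/Build dir lines above and the "User defined options" block that follows would be:

  src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/libvfio-user
  bld=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug
  CC=clang-17 meson setup "$bld" "$src" \
    --buildtype debug --default-library static --libdir /usr/local/lib

Keeping the build directory out of the source tree is what lets the install step below stage the library into spdk/build via DESTDIR instead of writing to /usr/local.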
00:02:15.613 Build targets in project: 8 00:02:15.613 WARNING: Project specifies a minimum meson_version '>= 0.53.0' but uses features which were added in newer versions: 00:02:15.613 * 0.57.0: {'exclude_suites arg in add_test_setup'} 00:02:15.613 00:02:15.613 libvfio-user 0.0.1 00:02:15.613 00:02:15.613 User defined options 00:02:15.613 buildtype : debug 00:02:15.613 default_library: static 00:02:15.613 libdir : /usr/local/lib 00:02:15.613 00:02:15.613 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:16.182 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:16.182 [1/36] Compiling C object samples/lspci.p/lspci.c.o 00:02:16.182 [2/36] Compiling C object lib/libvfio-user.a.p/tran.c.o 00:02:16.182 [3/36] Compiling C object lib/libvfio-user.a.p/pci.c.o 00:02:16.182 [4/36] Compiling C object samples/shadow_ioeventfd_server.p/shadow_ioeventfd_server.c.o 00:02:16.182 [5/36] Compiling C object lib/libvfio-user.a.p/irq.c.o 00:02:16.182 [6/36] Compiling C object samples/client.p/.._lib_tran.c.o 00:02:16.182 [7/36] Compiling C object test/unit_tests.p/.._lib_tran_pipe.c.o 00:02:16.182 [8/36] Compiling C object samples/gpio-pci-idio-16.p/gpio-pci-idio-16.c.o 00:02:16.182 [9/36] Compiling C object samples/client.p/.._lib_migration.c.o 00:02:16.182 [10/36] Compiling C object lib/libvfio-user.a.p/migration.c.o 00:02:16.182 [11/36] Compiling C object samples/null.p/null.c.o 00:02:16.182 [12/36] Compiling C object test/unit_tests.p/.._lib_irq.c.o 00:02:16.182 [13/36] Compiling C object test/unit_tests.p/.._lib_tran.c.o 00:02:16.182 [14/36] Compiling C object test/unit_tests.p/mocks.c.o 00:02:16.182 [15/36] Compiling C object test/unit_tests.p/.._lib_pci.c.o 00:02:16.182 [16/36] Compiling C object test/unit_tests.p/.._lib_migration.c.o 00:02:16.182 [17/36] Compiling C object lib/libvfio-user.a.p/dma.c.o 00:02:16.182 [18/36] Compiling C object lib/libvfio-user.a.p/pci_caps.c.o 00:02:16.182 [19/36] Compiling C object lib/libvfio-user.a.p/tran_sock.c.o 00:02:16.182 [20/36] Compiling C object samples/server.p/server.c.o 00:02:16.182 [21/36] Compiling C object samples/client.p/.._lib_tran_sock.c.o 00:02:16.182 [22/36] Compiling C object test/unit_tests.p/.._lib_dma.c.o 00:02:16.182 [23/36] Compiling C object test/unit_tests.p/.._lib_tran_sock.c.o 00:02:16.182 [24/36] Compiling C object test/unit_tests.p/.._lib_pci_caps.c.o 00:02:16.182 [25/36] Compiling C object test/unit_tests.p/unit-tests.c.o 00:02:16.182 [26/36] Compiling C object samples/client.p/client.c.o 00:02:16.182 [27/36] Compiling C object lib/libvfio-user.a.p/libvfio-user.c.o 00:02:16.182 [28/36] Compiling C object test/unit_tests.p/.._lib_libvfio-user.c.o 00:02:16.182 [29/36] Linking target samples/client 00:02:16.182 [30/36] Linking static target lib/libvfio-user.a 00:02:16.182 [31/36] Linking target samples/shadow_ioeventfd_server 00:02:16.182 [32/36] Linking target test/unit_tests 00:02:16.182 [33/36] Linking target samples/null 00:02:16.182 [34/36] Linking target samples/gpio-pci-idio-16 00:02:16.182 [35/36] Linking target samples/server 00:02:16.182 [36/36] Linking target samples/lspci 00:02:16.182 INFO: autodetecting backend as ninja 00:02:16.182 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:16.442 DESTDIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user meson install --quiet -C 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug 00:02:16.701 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/build-debug' 00:02:16.701 ninja: no work to do. 00:02:21.976 The Meson build system 00:02:21.976 Version: 1.5.0 00:02:21.976 Source dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk 00:02:21.976 Build dir: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp 00:02:21.976 Build type: native build 00:02:21.976 Program cat found: YES (/usr/bin/cat) 00:02:21.976 Project name: DPDK 00:02:21.976 Project version: 23.11.0 00:02:21.976 C compiler for the host machine: clang-17 (clang 17.0.6 "clang version 17.0.6 (Fedora 17.0.6-2.fc39)") 00:02:21.976 C linker for the host machine: clang-17 ld.bfd 2.40-14 00:02:21.976 Host machine cpu family: x86_64 00:02:21.976 Host machine cpu: x86_64 00:02:21.976 Message: ## Building in Developer Mode ## 00:02:21.976 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:21.976 Program check-symbols.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:21.976 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 00:02:21.976 Program python3 found: YES (/usr/bin/python3) 00:02:21.976 Program cat found: YES (/usr/bin/cat) 00:02:21.976 Compiler for C supports arguments -march=native: YES 00:02:21.976 Checking for size of "void *" : 8 00:02:21.976 Checking for size of "void *" : 8 (cached) 00:02:21.976 Library m found: YES 00:02:21.976 Library numa found: YES 00:02:21.976 Has header "numaif.h" : YES 00:02:21.976 Library fdt found: NO 00:02:21.976 Library execinfo found: NO 00:02:21.976 Has header "execinfo.h" : YES 00:02:21.976 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:21.976 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:21.976 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:21.976 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:21.976 Run-time dependency openssl found: YES 3.1.1 00:02:21.976 Run-time dependency libpcap found: YES 1.10.4 00:02:21.976 Has header "pcap.h" with dependency libpcap: YES 00:02:21.976 Compiler for C supports arguments -Wcast-qual: YES 00:02:21.976 Compiler for C supports arguments -Wdeprecated: YES 00:02:21.976 Compiler for C supports arguments -Wformat: YES 00:02:21.976 Compiler for C supports arguments -Wformat-nonliteral: YES 00:02:21.976 Compiler for C supports arguments -Wformat-security: YES 00:02:21.976 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:21.976 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:21.976 Compiler for C supports arguments -Wnested-externs: YES 00:02:21.976 Compiler for C supports arguments -Wold-style-definition: YES 00:02:21.976 Compiler for C supports arguments -Wpointer-arith: YES 00:02:21.976 Compiler for C supports arguments -Wsign-compare: YES 00:02:21.976 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:21.976 Compiler for C supports arguments -Wundef: YES 00:02:21.976 Compiler for C supports arguments -Wwrite-strings: YES 00:02:21.976 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:21.976 Compiler for C supports arguments -Wno-packed-not-aligned: NO 00:02:21.976 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:21.976 Program objdump found: YES (/usr/bin/objdump) 00:02:21.976 
Compiler for C supports arguments -mavx512f: YES 00:02:21.976 Checking if "AVX512 checking" compiles: YES 00:02:21.976 Fetching value of define "__SSE4_2__" : 1 00:02:21.976 Fetching value of define "__AES__" : 1 00:02:21.976 Fetching value of define "__AVX__" : 1 00:02:21.976 Fetching value of define "__AVX2__" : 1 00:02:21.976 Fetching value of define "__AVX512BW__" : 1 00:02:21.976 Fetching value of define "__AVX512CD__" : 1 00:02:21.976 Fetching value of define "__AVX512DQ__" : 1 00:02:21.976 Fetching value of define "__AVX512F__" : 1 00:02:21.976 Fetching value of define "__AVX512VL__" : 1 00:02:21.976 Fetching value of define "__PCLMUL__" : 1 00:02:21.976 Fetching value of define "__RDRND__" : 1 00:02:21.976 Fetching value of define "__RDSEED__" : 1 00:02:21.976 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:21.976 Fetching value of define "__znver1__" : (undefined) 00:02:21.976 Fetching value of define "__znver2__" : (undefined) 00:02:21.976 Fetching value of define "__znver3__" : (undefined) 00:02:21.976 Fetching value of define "__znver4__" : (undefined) 00:02:21.976 Compiler for C supports arguments -Wno-format-truncation: NO 00:02:21.976 Message: lib/log: Defining dependency "log" 00:02:21.976 Message: lib/kvargs: Defining dependency "kvargs" 00:02:21.976 Message: lib/telemetry: Defining dependency "telemetry" 00:02:21.976 Checking for function "getentropy" : NO 00:02:21.976 Message: lib/eal: Defining dependency "eal" 00:02:21.976 Message: lib/ring: Defining dependency "ring" 00:02:21.976 Message: lib/rcu: Defining dependency "rcu" 00:02:21.976 Message: lib/mempool: Defining dependency "mempool" 00:02:21.976 Message: lib/mbuf: Defining dependency "mbuf" 00:02:21.976 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:21.976 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:21.976 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:21.976 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:21.976 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:21.976 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached) 00:02:21.976 Compiler for C supports arguments -mpclmul: YES 00:02:21.976 Compiler for C supports arguments -maes: YES 00:02:21.976 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:21.976 Compiler for C supports arguments -mavx512bw: YES 00:02:21.976 Compiler for C supports arguments -mavx512dq: YES 00:02:21.976 Compiler for C supports arguments -mavx512vl: YES 00:02:21.976 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:21.976 Compiler for C supports arguments -mavx2: YES 00:02:21.976 Compiler for C supports arguments -mavx: YES 00:02:21.976 Message: lib/net: Defining dependency "net" 00:02:21.976 Message: lib/meter: Defining dependency "meter" 00:02:21.976 Message: lib/ethdev: Defining dependency "ethdev" 00:02:21.976 Message: lib/pci: Defining dependency "pci" 00:02:21.976 Message: lib/cmdline: Defining dependency "cmdline" 00:02:21.976 Message: lib/hash: Defining dependency "hash" 00:02:21.976 Message: lib/timer: Defining dependency "timer" 00:02:21.976 Message: lib/compressdev: Defining dependency "compressdev" 00:02:21.976 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:21.976 Message: lib/dmadev: Defining dependency "dmadev" 00:02:21.976 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:21.976 Message: lib/power: Defining dependency "power" 00:02:21.976 Message: lib/reorder: Defining dependency "reorder" 00:02:21.976 Message: lib/security: Defining dependency 
"security" 00:02:21.976 Has header "linux/userfaultfd.h" : YES 00:02:21.976 Has header "linux/vduse.h" : YES 00:02:21.976 Message: lib/vhost: Defining dependency "vhost" 00:02:21.976 Compiler for C supports arguments -Wno-format-truncation: NO (cached) 00:02:21.976 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:21.976 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:21.976 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:21.976 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:21.976 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:21.976 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:21.976 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:21.976 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:21.976 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:21.976 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:21.976 Configuring doxy-api-html.conf using configuration 00:02:21.977 Configuring doxy-api-man.conf using configuration 00:02:21.977 Program mandb found: YES (/usr/bin/mandb) 00:02:21.977 Program sphinx-build found: NO 00:02:21.977 Configuring rte_build_config.h using configuration 00:02:21.977 Message: 00:02:21.977 ================= 00:02:21.977 Applications Enabled 00:02:21.977 ================= 00:02:21.977 00:02:21.977 apps: 00:02:21.977 00:02:21.977 00:02:21.977 Message: 00:02:21.977 ================= 00:02:21.977 Libraries Enabled 00:02:21.977 ================= 00:02:21.977 00:02:21.977 libs: 00:02:21.977 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:21.977 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:21.977 cryptodev, dmadev, power, reorder, security, vhost, 00:02:21.977 00:02:21.977 Message: 00:02:21.977 =============== 00:02:21.977 Drivers Enabled 00:02:21.977 =============== 00:02:21.977 00:02:21.977 common: 00:02:21.977 00:02:21.977 bus: 00:02:21.977 pci, vdev, 00:02:21.977 mempool: 00:02:21.977 ring, 00:02:21.977 dma: 00:02:21.977 00:02:21.977 net: 00:02:21.977 00:02:21.977 crypto: 00:02:21.977 00:02:21.977 compress: 00:02:21.977 00:02:21.977 vdpa: 00:02:21.977 00:02:21.977 00:02:21.977 Message: 00:02:21.977 ================= 00:02:21.977 Content Skipped 00:02:21.977 ================= 00:02:21.977 00:02:21.977 apps: 00:02:21.977 dumpcap: explicitly disabled via build config 00:02:21.977 graph: explicitly disabled via build config 00:02:21.977 pdump: explicitly disabled via build config 00:02:21.977 proc-info: explicitly disabled via build config 00:02:21.977 test-acl: explicitly disabled via build config 00:02:21.977 test-bbdev: explicitly disabled via build config 00:02:21.977 test-cmdline: explicitly disabled via build config 00:02:21.977 test-compress-perf: explicitly disabled via build config 00:02:21.977 test-crypto-perf: explicitly disabled via build config 00:02:21.977 test-dma-perf: explicitly disabled via build config 00:02:21.977 test-eventdev: explicitly disabled via build config 00:02:21.977 test-fib: explicitly disabled via build config 00:02:21.977 test-flow-perf: explicitly disabled via build config 00:02:21.977 test-gpudev: explicitly disabled via build config 00:02:21.977 test-mldev: explicitly disabled via build config 00:02:21.977 test-pipeline: explicitly disabled via build config 00:02:21.977 test-pmd: explicitly disabled via build config 00:02:21.977 test-regex: explicitly disabled 
via build config 00:02:21.977 test-sad: explicitly disabled via build config 00:02:21.977 test-security-perf: explicitly disabled via build config 00:02:21.977 00:02:21.977 libs: 00:02:21.977 metrics: explicitly disabled via build config 00:02:21.977 acl: explicitly disabled via build config 00:02:21.977 bbdev: explicitly disabled via build config 00:02:21.977 bitratestats: explicitly disabled via build config 00:02:21.977 bpf: explicitly disabled via build config 00:02:21.977 cfgfile: explicitly disabled via build config 00:02:21.977 distributor: explicitly disabled via build config 00:02:21.977 efd: explicitly disabled via build config 00:02:21.977 eventdev: explicitly disabled via build config 00:02:21.977 dispatcher: explicitly disabled via build config 00:02:21.977 gpudev: explicitly disabled via build config 00:02:21.977 gro: explicitly disabled via build config 00:02:21.977 gso: explicitly disabled via build config 00:02:21.977 ip_frag: explicitly disabled via build config 00:02:21.977 jobstats: explicitly disabled via build config 00:02:21.977 latencystats: explicitly disabled via build config 00:02:21.977 lpm: explicitly disabled via build config 00:02:21.977 member: explicitly disabled via build config 00:02:21.977 pcapng: explicitly disabled via build config 00:02:21.977 rawdev: explicitly disabled via build config 00:02:21.977 regexdev: explicitly disabled via build config 00:02:21.977 mldev: explicitly disabled via build config 00:02:21.977 rib: explicitly disabled via build config 00:02:21.977 sched: explicitly disabled via build config 00:02:21.977 stack: explicitly disabled via build config 00:02:21.977 ipsec: explicitly disabled via build config 00:02:21.977 pdcp: explicitly disabled via build config 00:02:21.977 fib: explicitly disabled via build config 00:02:21.977 port: explicitly disabled via build config 00:02:21.977 pdump: explicitly disabled via build config 00:02:21.977 table: explicitly disabled via build config 00:02:21.977 pipeline: explicitly disabled via build config 00:02:21.977 graph: explicitly disabled via build config 00:02:21.977 node: explicitly disabled via build config 00:02:21.977 00:02:21.977 drivers: 00:02:21.977 common/cpt: not in enabled drivers build config 00:02:21.977 common/dpaax: not in enabled drivers build config 00:02:21.977 common/iavf: not in enabled drivers build config 00:02:21.977 common/idpf: not in enabled drivers build config 00:02:21.977 common/mvep: not in enabled drivers build config 00:02:21.977 common/octeontx: not in enabled drivers build config 00:02:21.977 bus/auxiliary: not in enabled drivers build config 00:02:21.977 bus/cdx: not in enabled drivers build config 00:02:21.977 bus/dpaa: not in enabled drivers build config 00:02:21.977 bus/fslmc: not in enabled drivers build config 00:02:21.977 bus/ifpga: not in enabled drivers build config 00:02:21.977 bus/platform: not in enabled drivers build config 00:02:21.977 bus/vmbus: not in enabled drivers build config 00:02:21.977 common/cnxk: not in enabled drivers build config 00:02:21.977 common/mlx5: not in enabled drivers build config 00:02:21.977 common/nfp: not in enabled drivers build config 00:02:21.977 common/qat: not in enabled drivers build config 00:02:21.977 common/sfc_efx: not in enabled drivers build config 00:02:21.977 mempool/bucket: not in enabled drivers build config 00:02:21.977 mempool/cnxk: not in enabled drivers build config 00:02:21.977 mempool/dpaa: not in enabled drivers build config 00:02:21.977 mempool/dpaa2: not in enabled drivers build config 
00:02:21.977 mempool/octeontx: not in enabled drivers build config 00:02:21.977 mempool/stack: not in enabled drivers build config 00:02:21.977 dma/cnxk: not in enabled drivers build config 00:02:21.977 dma/dpaa: not in enabled drivers build config 00:02:21.977 dma/dpaa2: not in enabled drivers build config 00:02:21.977 dma/hisilicon: not in enabled drivers build config 00:02:21.977 dma/idxd: not in enabled drivers build config 00:02:21.977 dma/ioat: not in enabled drivers build config 00:02:21.977 dma/skeleton: not in enabled drivers build config 00:02:21.977 net/af_packet: not in enabled drivers build config 00:02:21.977 net/af_xdp: not in enabled drivers build config 00:02:21.977 net/ark: not in enabled drivers build config 00:02:21.977 net/atlantic: not in enabled drivers build config 00:02:21.977 net/avp: not in enabled drivers build config 00:02:21.977 net/axgbe: not in enabled drivers build config 00:02:21.977 net/bnx2x: not in enabled drivers build config 00:02:21.977 net/bnxt: not in enabled drivers build config 00:02:21.977 net/bonding: not in enabled drivers build config 00:02:21.977 net/cnxk: not in enabled drivers build config 00:02:21.977 net/cpfl: not in enabled drivers build config 00:02:21.977 net/cxgbe: not in enabled drivers build config 00:02:21.977 net/dpaa: not in enabled drivers build config 00:02:21.977 net/dpaa2: not in enabled drivers build config 00:02:21.977 net/e1000: not in enabled drivers build config 00:02:21.977 net/ena: not in enabled drivers build config 00:02:21.977 net/enetc: not in enabled drivers build config 00:02:21.977 net/enetfec: not in enabled drivers build config 00:02:21.977 net/enic: not in enabled drivers build config 00:02:21.977 net/failsafe: not in enabled drivers build config 00:02:21.977 net/fm10k: not in enabled drivers build config 00:02:21.977 net/gve: not in enabled drivers build config 00:02:21.977 net/hinic: not in enabled drivers build config 00:02:21.977 net/hns3: not in enabled drivers build config 00:02:21.977 net/i40e: not in enabled drivers build config 00:02:21.977 net/iavf: not in enabled drivers build config 00:02:21.977 net/ice: not in enabled drivers build config 00:02:21.977 net/idpf: not in enabled drivers build config 00:02:21.977 net/igc: not in enabled drivers build config 00:02:21.977 net/ionic: not in enabled drivers build config 00:02:21.977 net/ipn3ke: not in enabled drivers build config 00:02:21.977 net/ixgbe: not in enabled drivers build config 00:02:21.977 net/mana: not in enabled drivers build config 00:02:21.977 net/memif: not in enabled drivers build config 00:02:21.977 net/mlx4: not in enabled drivers build config 00:02:21.977 net/mlx5: not in enabled drivers build config 00:02:21.977 net/mvneta: not in enabled drivers build config 00:02:21.977 net/mvpp2: not in enabled drivers build config 00:02:21.977 net/netvsc: not in enabled drivers build config 00:02:21.977 net/nfb: not in enabled drivers build config 00:02:21.977 net/nfp: not in enabled drivers build config 00:02:21.977 net/ngbe: not in enabled drivers build config 00:02:21.977 net/null: not in enabled drivers build config 00:02:21.977 net/octeontx: not in enabled drivers build config 00:02:21.977 net/octeon_ep: not in enabled drivers build config 00:02:21.977 net/pcap: not in enabled drivers build config 00:02:21.977 net/pfe: not in enabled drivers build config 00:02:21.977 net/qede: not in enabled drivers build config 00:02:21.977 net/ring: not in enabled drivers build config 00:02:21.977 net/sfc: not in enabled drivers build config 00:02:21.977 
net/softnic: not in enabled drivers build config 00:02:21.977 net/tap: not in enabled drivers build config 00:02:21.977 net/thunderx: not in enabled drivers build config 00:02:21.977 net/txgbe: not in enabled drivers build config 00:02:21.977 net/vdev_netvsc: not in enabled drivers build config 00:02:21.977 net/vhost: not in enabled drivers build config 00:02:21.977 net/virtio: not in enabled drivers build config 00:02:21.977 net/vmxnet3: not in enabled drivers build config 00:02:21.977 raw/*: missing internal dependency, "rawdev" 00:02:21.977 crypto/armv8: not in enabled drivers build config 00:02:21.977 crypto/bcmfs: not in enabled drivers build config 00:02:21.977 crypto/caam_jr: not in enabled drivers build config 00:02:21.977 crypto/ccp: not in enabled drivers build config 00:02:21.977 crypto/cnxk: not in enabled drivers build config 00:02:21.977 crypto/dpaa_sec: not in enabled drivers build config 00:02:21.977 crypto/dpaa2_sec: not in enabled drivers build config 00:02:21.977 crypto/ipsec_mb: not in enabled drivers build config 00:02:21.977 crypto/mlx5: not in enabled drivers build config 00:02:21.978 crypto/mvsam: not in enabled drivers build config 00:02:21.978 crypto/nitrox: not in enabled drivers build config 00:02:21.978 crypto/null: not in enabled drivers build config 00:02:21.978 crypto/octeontx: not in enabled drivers build config 00:02:21.978 crypto/openssl: not in enabled drivers build config 00:02:21.978 crypto/scheduler: not in enabled drivers build config 00:02:21.978 crypto/uadk: not in enabled drivers build config 00:02:21.978 crypto/virtio: not in enabled drivers build config 00:02:21.978 compress/isal: not in enabled drivers build config 00:02:21.978 compress/mlx5: not in enabled drivers build config 00:02:21.978 compress/octeontx: not in enabled drivers build config 00:02:21.978 compress/zlib: not in enabled drivers build config 00:02:21.978 regex/*: missing internal dependency, "regexdev" 00:02:21.978 ml/*: missing internal dependency, "mldev" 00:02:21.978 vdpa/ifc: not in enabled drivers build config 00:02:21.978 vdpa/mlx5: not in enabled drivers build config 00:02:21.978 vdpa/nfp: not in enabled drivers build config 00:02:21.978 vdpa/sfc: not in enabled drivers build config 00:02:21.978 event/*: missing internal dependency, "eventdev" 00:02:21.978 baseband/*: missing internal dependency, "bbdev" 00:02:21.978 gpu/*: missing internal dependency, "gpudev" 00:02:21.978 00:02:21.978 00:02:21.978 Build targets in project: 85 00:02:21.978 00:02:21.978 DPDK 23.11.0 00:02:21.978 00:02:21.978 User defined options 00:02:21.978 buildtype : debug 00:02:21.978 default_library : static 00:02:21.978 libdir : lib 00:02:21.978 prefix : /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:02:21.978 c_args : -fPIC -Werror 00:02:21.978 c_link_args : 00:02:21.978 cpu_instruction_set: native 00:02:21.978 disable_apps : test-fib,test-sad,test,test-regex,test-security-perf,test-bbdev,dumpcap,test-crypto-perf,test-flow-perf,test-gpudev,test-cmdline,test-dma-perf,test-eventdev,test-pipeline,test-acl,proc-info,test-compress-perf,graph,test-pmd,test-mldev,pdump 00:02:21.978 disable_libs : bbdev,latencystats,member,gpudev,mldev,pipeline,lpm,efd,regexdev,sched,node,dispatcher,table,bpf,port,gro,fib,cfgfile,ip_frag,gso,rawdev,ipsec,pdcp,rib,acl,metrics,graph,pcapng,jobstats,eventdev,stack,bitratestats,distributor,pdump 00:02:21.978 enable_docs : false 00:02:21.978 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring 00:02:21.978 enable_kmods : false 00:02:21.978 tests : false 
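Note: the "User defined options" block above is the knob set SPDK's configure passes to its bundled DPDK. Reassembled into one command line (all values copied from the block; the actual invocation goes through SPDK's build scripts, so this is an approximation):

  dpdk=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
  meson setup "$dpdk/build-tmp" "$dpdk" \
    --buildtype debug --default-library static --libdir lib \
    --prefix "$dpdk/build" \
    -Dc_args='-fPIC -Werror' -Dcpu_instruction_set=native \
    -Denable_docs=false -Denable_kmods=false -Dtests=false \
    -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring \
    -Ddisable_apps=test-fib,test-sad,test,test-regex,test-security-perf,test-bbdev,dumpcap,test-crypto-perf,test-flow-perf,test-gpudev,test-cmdline,test-dma-perf,test-eventdev,test-pipeline,test-acl,proc-info,test-compress-perf,graph,test-pmd,test-mldev,pdump \
    -Ddisable_libs=bbdev,latencystats,member,gpudev,mldev,pipeline,lpm,efd,regexdev,sched,node,dispatcher,table,bpf,port,gro,fib,cfgfile,ip_frag,gso,rawdev,ipsec,pdcp,rib,acl,metrics,graph,pcapng,jobstats,eventdev,stack,bitratestats,distributor,pdump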
00:02:21.978 00:02:21.978 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:22.243 ninja: Entering directory `/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp' 00:02:22.243 [1/265] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:22.243 [2/265] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:22.243 [3/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:22.243 [4/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:22.243 [5/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:22.243 [6/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:22.243 [7/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:22.243 [8/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:22.243 [9/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:22.243 [10/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:22.243 [11/265] Linking static target lib/librte_kvargs.a 00:02:22.243 [12/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:22.243 [13/265] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:22.243 [14/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:22.243 [15/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:22.243 [16/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:22.243 [17/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:22.243 [18/265] Linking static target lib/librte_log.a 00:02:22.243 [19/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:22.243 [20/265] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:22.243 [21/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:22.243 [22/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:22.243 [23/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:22.243 [24/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:22.506 [25/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:22.506 [26/265] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:22.506 [27/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:22.506 [28/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:22.506 [29/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:22.506 [30/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:22.506 [31/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:22.506 [32/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:22.506 [33/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:22.506 [34/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:22.506 [35/265] Linking static target lib/librte_pci.a 00:02:22.506 [36/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:22.506 [37/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:22.506 [38/265] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:22.506 [39/265] Compiling C object 
lib/librte_power.a.p/power_power_common.c.o 00:02:22.506 [40/265] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:22.506 [41/265] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:22.764 [42/265] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.764 [43/265] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.764 [44/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:22.764 [45/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:22.764 [46/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:22.764 [47/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:22.764 [48/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:22.764 [49/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:22.764 [50/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:22.764 [51/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:22.764 [52/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:22.764 [53/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:22.764 [54/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:22.764 [55/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:22.764 [56/265] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:22.764 [57/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:22.764 [58/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:22.764 [59/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:22.764 [60/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:22.764 [61/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:22.764 [62/265] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:22.764 [63/265] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:22.764 [64/265] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:22.764 [65/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:22.764 [66/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:22.765 [67/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:22.765 [68/265] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:22.765 [69/265] Linking static target lib/librte_telemetry.a 00:02:22.765 [70/265] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:22.765 [71/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:22.765 [72/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:22.765 [73/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:22.765 [74/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:22.765 [75/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:22.765 [76/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:22.765 [77/265] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:22.765 [78/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:22.765 [79/265] 
Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:22.765 [80/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:22.765 [81/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:22.765 [82/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:22.765 [83/265] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:22.765 [84/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:22.765 [85/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:22.765 [86/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:22.765 [87/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:22.765 [88/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:22.765 [89/265] Linking static target lib/librte_meter.a 00:02:23.024 [90/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:23.024 [91/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:23.024 [92/265] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:23.024 [93/265] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:23.024 [94/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:23.024 [95/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:23.024 [96/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:23.024 [97/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:23.024 [98/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:23.024 [99/265] Linking static target lib/librte_ring.a 00:02:23.024 [100/265] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:23.024 [101/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:23.024 [102/265] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:23.024 [103/265] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:23.024 [104/265] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:23.024 [105/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:23.024 [106/265] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:23.024 [107/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:23.024 [108/265] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:23.024 [109/265] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:23.024 [110/265] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:23.024 [111/265] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:23.024 [112/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:23.024 [113/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:23.024 [114/265] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:23.024 [115/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:23.024 [116/265] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:23.024 [117/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:23.024 [118/265] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 
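The [N/265] records through this stretch are ninja progress counters for the DPDK submodule build that SPDK performs before compiling its own tree; the backend command ninja reports a little further down in this log is /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp. A minimal sketch of driving the same step by hand, assuming this workspace layout and omitting the meson options that SPDK's configure normally supplies:

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk
    meson setup build-tmp               # configure step; real runs inherit options from SPDK's ./configure
    ninja -C build-tmp -j "$(nproc)"    # same entry point the log shows meson autodetecting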
00:02:23.024 [119/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:23.024 [120/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:23.024 [121/265] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:23.024 [122/265] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.024 [123/265] Linking static target lib/librte_cmdline.a 00:02:23.024 [124/265] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:23.024 [125/265] Linking static target lib/librte_timer.a 00:02:23.024 [126/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:23.024 [127/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:23.024 [128/265] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:23.024 [129/265] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:23.024 [130/265] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:23.024 [131/265] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:23.024 [132/265] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:23.024 [133/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:23.024 [134/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:23.024 [135/265] Linking static target lib/librte_mempool.a 00:02:23.024 [136/265] Linking static target lib/librte_net.a 00:02:23.024 [137/265] Linking static target lib/librte_dmadev.a 00:02:23.024 [138/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:23.024 [139/265] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:23.024 [140/265] Linking static target lib/librte_eal.a 00:02:23.024 [141/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:23.024 [142/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:23.024 [143/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:23.024 [144/265] Linking target lib/librte_log.so.24.0 00:02:23.024 [145/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:23.024 [146/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:23.024 [147/265] Linking static target lib/librte_compressdev.a 00:02:23.024 [148/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:23.024 [149/265] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:23.024 [150/265] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:23.024 [151/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:23.024 [152/265] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:23.024 [153/265] Linking static target lib/librte_rcu.a 00:02:23.024 [154/265] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:23.024 [155/265] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:23.024 [156/265] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:23.024 [157/265] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:23.024 [158/265] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:23.024 [159/265] Linking static target lib/librte_reorder.a 00:02:23.024 [160/265] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:23.024 [161/265] Generating symbol file 
lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols 00:02:23.024 [162/265] Linking static target lib/librte_power.a 00:02:23.024 [163/265] Linking static target lib/librte_mbuf.a 00:02:23.024 [164/265] Linking static target lib/librte_security.a 00:02:23.024 [165/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:23.283 [166/265] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:23.283 [167/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:23.283 [168/265] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:23.283 [169/265] Linking target lib/librte_kvargs.so.24.0 00:02:23.283 [170/265] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.283 [171/265] Linking static target lib/librte_hash.a 00:02:23.283 [172/265] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:23.283 [173/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:23.283 [174/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:23.283 [175/265] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:23.283 [176/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:23.283 [177/265] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.283 [178/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:23.283 [179/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:23.283 [180/265] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:23.283 [181/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:23.283 [182/265] Linking static target lib/librte_cryptodev.a 00:02:23.283 [183/265] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols 00:02:23.283 [184/265] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:23.283 [185/265] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.283 [186/265] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:23.283 [187/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:23.283 [188/265] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:23.284 [189/265] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:23.284 [190/265] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:23.284 [191/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:23.284 [192/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:23.284 [193/265] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:23.542 [194/265] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:23.542 [195/265] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.542 [196/265] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:23.542 [197/265] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:23.542 [198/265] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.542 [199/265] Linking static target drivers/librte_bus_vdev.a 00:02:23.542 [200/265] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.542 
[201/265] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.542 [202/265] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:23.542 [203/265] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:23.542 [204/265] Linking target lib/librte_telemetry.so.24.0 00:02:23.542 [205/265] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:23.542 [206/265] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:23.542 [207/265] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:23.542 [208/265] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:23.542 [209/265] Linking static target lib/librte_ethdev.a 00:02:23.542 [210/265] Linking static target drivers/librte_mempool_ring.a 00:02:23.542 [211/265] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.543 [212/265] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:23.543 [213/265] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:23.543 [214/265] Linking static target drivers/librte_bus_pci.a 00:02:23.543 [215/265] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:23.801 [216/265] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.801 [217/265] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.801 [218/265] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.801 [219/265] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.059 [220/265] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.059 [221/265] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.059 [222/265] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.317 [223/265] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:24.317 [224/265] Linking static target lib/librte_vhost.a 00:02:24.317 [225/265] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.575 [226/265] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.509 [227/265] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.444 [228/265] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.009 [229/265] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.381 [230/265] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:36.381 [231/265] Linking target lib/librte_eal.so.24.0 00:02:36.381 [232/265] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:36.381 [233/265] Linking target lib/librte_pci.so.24.0 00:02:36.381 [234/265] Linking target lib/librte_meter.so.24.0 00:02:36.381 [235/265] Linking target lib/librte_timer.so.24.0 00:02:36.381 [236/265] Linking target lib/librte_ring.so.24.0 00:02:36.381 [237/265] Linking target lib/librte_dmadev.so.24.0 00:02:36.381 [238/265] Linking target 
drivers/librte_bus_vdev.so.24.0 00:02:36.381 [239/265] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:36.381 [240/265] Generating symbol file lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:36.381 [241/265] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:36.381 [242/265] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:36.381 [243/265] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:36.382 [244/265] Linking target drivers/librte_bus_pci.so.24.0 00:02:36.382 [245/265] Linking target lib/librte_mempool.so.24.0 00:02:36.382 [246/265] Linking target lib/librte_rcu.so.24.0 00:02:36.641 [247/265] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:36.641 [248/265] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:36.641 [249/265] Linking target drivers/librte_mempool_ring.so.24.0 00:02:36.641 [250/265] Linking target lib/librte_mbuf.so.24.0 00:02:36.641 [251/265] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:36.899 [252/265] Linking target lib/librte_reorder.so.24.0 00:02:36.899 [253/265] Linking target lib/librte_net.so.24.0 00:02:36.899 [254/265] Linking target lib/librte_compressdev.so.24.0 00:02:36.899 [255/265] Linking target lib/librte_cryptodev.so.24.0 00:02:36.899 [256/265] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:36.899 [257/265] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:36.899 [258/265] Linking target lib/librte_cmdline.so.24.0 00:02:36.899 [259/265] Linking target lib/librte_hash.so.24.0 00:02:36.899 [260/265] Linking target lib/librte_security.so.24.0 00:02:36.899 [261/265] Linking target lib/librte_ethdev.so.24.0 00:02:37.159 [262/265] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:37.159 [263/265] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:37.159 [264/265] Linking target lib/librte_vhost.so.24.0 00:02:37.159 [265/265] Linking target lib/librte_power.so.24.0 00:02:37.159 INFO: autodetecting backend as ninja 00:02:37.159 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build-tmp -j 112 00:02:38.096 CC lib/ut/ut.o 00:02:38.096 CC lib/log/log.o 00:02:38.096 CC lib/log/log_deprecated.o 00:02:38.096 CC lib/log/log_flags.o 00:02:38.096 CC lib/ut_mock/mock.o 00:02:38.355 LIB libspdk_ut.a 00:02:38.355 LIB libspdk_ut_mock.a 00:02:38.355 LIB libspdk_log.a 00:02:38.614 CXX lib/trace_parser/trace.o 00:02:38.614 CC lib/util/cpuset.o 00:02:38.614 CC lib/util/base64.o 00:02:38.614 CC lib/util/bit_array.o 00:02:38.614 CC lib/util/crc32.o 00:02:38.614 CC lib/util/crc16.o 00:02:38.614 CC lib/util/crc32c.o 00:02:38.614 CC lib/util/crc64.o 00:02:38.614 CC lib/util/crc32_ieee.o 00:02:38.614 CC lib/dma/dma.o 00:02:38.614 CC lib/util/dif.o 00:02:38.614 CC lib/ioat/ioat.o 00:02:38.614 CC lib/util/fd.o 00:02:38.614 CC lib/util/file.o 00:02:38.614 CC lib/util/hexlify.o 00:02:38.614 CC lib/util/iov.o 00:02:38.614 CC lib/util/math.o 00:02:38.614 CC lib/util/pipe.o 00:02:38.614 CC lib/util/strerror_tls.o 00:02:38.614 CC lib/util/string.o 00:02:38.614 CC lib/util/uuid.o 00:02:38.614 CC lib/util/fd_group.o 00:02:38.614 CC lib/util/xor.o 00:02:38.614 CC lib/util/zipf.o 00:02:38.614 CC 
lib/vfio_user/host/vfio_user_pci.o 00:02:38.614 CC lib/vfio_user/host/vfio_user.o 00:02:38.873 LIB libspdk_dma.a 00:02:38.873 LIB libspdk_ioat.a 00:02:38.873 LIB libspdk_vfio_user.a 00:02:38.873 LIB libspdk_util.a 00:02:39.131 LIB libspdk_trace_parser.a 00:02:39.131 CC lib/env_dpdk/env.o 00:02:39.131 CC lib/env_dpdk/memory.o 00:02:39.131 CC lib/vmd/led.o 00:02:39.131 CC lib/env_dpdk/pci.o 00:02:39.131 CC lib/vmd/vmd.o 00:02:39.131 CC lib/env_dpdk/threads.o 00:02:39.131 CC lib/env_dpdk/init.o 00:02:39.131 CC lib/rdma/rdma_verbs.o 00:02:39.131 CC lib/env_dpdk/pci_ioat.o 00:02:39.131 CC lib/rdma/common.o 00:02:39.131 CC lib/env_dpdk/pci_idxd.o 00:02:39.131 CC lib/env_dpdk/pci_virtio.o 00:02:39.131 CC lib/env_dpdk/pci_vmd.o 00:02:39.131 CC lib/env_dpdk/pci_event.o 00:02:39.131 CC lib/env_dpdk/sigbus_handler.o 00:02:39.131 CC lib/env_dpdk/pci_dpdk.o 00:02:39.131 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:39.131 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:39.131 CC lib/json/json_parse.o 00:02:39.131 CC lib/idxd/idxd.o 00:02:39.131 CC lib/conf/conf.o 00:02:39.131 CC lib/json/json_util.o 00:02:39.131 CC lib/idxd/idxd_user.o 00:02:39.131 CC lib/json/json_write.o 00:02:39.131 CC lib/idxd/idxd_kernel.o 00:02:39.389 LIB libspdk_conf.a 00:02:39.389 LIB libspdk_rdma.a 00:02:39.389 LIB libspdk_json.a 00:02:39.647 LIB libspdk_idxd.a 00:02:39.647 LIB libspdk_vmd.a 00:02:39.647 CC lib/jsonrpc/jsonrpc_server.o 00:02:39.647 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:39.647 CC lib/jsonrpc/jsonrpc_client.o 00:02:39.647 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:39.905 LIB libspdk_jsonrpc.a 00:02:40.163 LIB libspdk_env_dpdk.a 00:02:40.163 CC lib/rpc/rpc.o 00:02:40.421 LIB libspdk_rpc.a 00:02:40.680 CC lib/sock/sock.o 00:02:40.680 CC lib/sock/sock_rpc.o 00:02:40.680 CC lib/notify/notify_rpc.o 00:02:40.680 CC lib/trace/trace.o 00:02:40.680 CC lib/notify/notify.o 00:02:40.680 CC lib/trace/trace_flags.o 00:02:40.680 CC lib/trace/trace_rpc.o 00:02:40.680 LIB libspdk_notify.a 00:02:40.680 LIB libspdk_trace.a 00:02:40.939 LIB libspdk_sock.a 00:02:40.939 CC lib/thread/thread.o 00:02:40.939 CC lib/thread/iobuf.o 00:02:41.197 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:41.197 CC lib/nvme/nvme_ctrlr.o 00:02:41.197 CC lib/nvme/nvme_ns.o 00:02:41.197 CC lib/nvme/nvme_fabric.o 00:02:41.197 CC lib/nvme/nvme_pcie_common.o 00:02:41.197 CC lib/nvme/nvme_ns_cmd.o 00:02:41.197 CC lib/nvme/nvme.o 00:02:41.197 CC lib/nvme/nvme_pcie.o 00:02:41.197 CC lib/nvme/nvme_transport.o 00:02:41.197 CC lib/nvme/nvme_qpair.o 00:02:41.197 CC lib/nvme/nvme_quirks.o 00:02:41.197 CC lib/nvme/nvme_discovery.o 00:02:41.197 CC lib/nvme/nvme_tcp.o 00:02:41.197 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:41.197 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:41.197 CC lib/nvme/nvme_opal.o 00:02:41.197 CC lib/nvme/nvme_io_msg.o 00:02:41.197 CC lib/nvme/nvme_poll_group.o 00:02:41.197 CC lib/nvme/nvme_zns.o 00:02:41.197 CC lib/nvme/nvme_cuse.o 00:02:41.197 CC lib/nvme/nvme_vfio_user.o 00:02:41.197 CC lib/nvme/nvme_rdma.o 00:02:41.764 LIB libspdk_thread.a 00:02:42.023 CC lib/blob/blobstore.o 00:02:42.023 CC lib/blob/request.o 00:02:42.023 CC lib/virtio/virtio.o 00:02:42.023 CC lib/virtio/virtio_vhost_user.o 00:02:42.023 CC lib/blob/zeroes.o 00:02:42.023 CC lib/virtio/virtio_vfio_user.o 00:02:42.023 CC lib/blob/blob_bs_dev.o 00:02:42.023 CC lib/virtio/virtio_pci.o 00:02:42.023 CC lib/init/json_config.o 00:02:42.023 CC lib/init/rpc.o 00:02:42.023 CC lib/init/subsystem.o 00:02:42.023 CC lib/init/subsystem_rpc.o 00:02:42.023 CC lib/accel/accel_sw.o 00:02:42.023 CC lib/accel/accel.o 00:02:42.023 CC 
lib/accel/accel_rpc.o 00:02:42.023 CC lib/vfu_tgt/tgt_endpoint.o 00:02:42.023 CC lib/vfu_tgt/tgt_rpc.o 00:02:42.282 LIB libspdk_init.a 00:02:42.282 LIB libspdk_virtio.a 00:02:42.282 LIB libspdk_vfu_tgt.a 00:02:42.282 LIB libspdk_nvme.a 00:02:42.540 CC lib/event/app.o 00:02:42.540 CC lib/event/reactor.o 00:02:42.540 CC lib/event/log_rpc.o 00:02:42.540 CC lib/event/app_rpc.o 00:02:42.540 CC lib/event/scheduler_static.o 00:02:42.799 LIB libspdk_event.a 00:02:42.799 LIB libspdk_accel.a 00:02:43.059 CC lib/bdev/bdev.o 00:02:43.059 CC lib/bdev/bdev_rpc.o 00:02:43.059 CC lib/bdev/bdev_zone.o 00:02:43.059 CC lib/bdev/part.o 00:02:43.059 CC lib/bdev/scsi_nvme.o 00:02:43.625 LIB libspdk_blob.a 00:02:43.883 CC lib/lvol/lvol.o 00:02:43.883 CC lib/blobfs/blobfs.o 00:02:43.883 CC lib/blobfs/tree.o 00:02:44.450 LIB libspdk_lvol.a 00:02:44.450 LIB libspdk_blobfs.a 00:02:44.707 LIB libspdk_bdev.a 00:02:44.964 CC lib/ublk/ublk.o 00:02:44.964 CC lib/ublk/ublk_rpc.o 00:02:44.964 CC lib/nbd/nbd.o 00:02:44.964 CC lib/nbd/nbd_rpc.o 00:02:44.964 CC lib/nvmf/ctrlr.o 00:02:44.964 CC lib/nvmf/ctrlr_bdev.o 00:02:44.964 CC lib/scsi/dev.o 00:02:44.964 CC lib/nvmf/ctrlr_discovery.o 00:02:44.964 CC lib/ftl/ftl_core.o 00:02:44.964 CC lib/nvmf/subsystem.o 00:02:44.964 CC lib/scsi/lun.o 00:02:44.964 CC lib/ftl/ftl_init.o 00:02:44.964 CC lib/scsi/port.o 00:02:44.964 CC lib/nvmf/nvmf.o 00:02:44.964 CC lib/ftl/ftl_layout.o 00:02:44.964 CC lib/scsi/scsi.o 00:02:44.964 CC lib/ftl/ftl_debug.o 00:02:44.964 CC lib/nvmf/nvmf_rpc.o 00:02:44.964 CC lib/scsi/scsi_bdev.o 00:02:44.964 CC lib/ftl/ftl_io.o 00:02:44.964 CC lib/ftl/ftl_sb.o 00:02:44.964 CC lib/scsi/scsi_pr.o 00:02:44.964 CC lib/scsi/task.o 00:02:44.964 CC lib/scsi/scsi_rpc.o 00:02:44.964 CC lib/nvmf/transport.o 00:02:44.964 CC lib/ftl/ftl_l2p.o 00:02:44.964 CC lib/nvmf/tcp.o 00:02:44.965 CC lib/ftl/ftl_l2p_flat.o 00:02:44.965 CC lib/ftl/ftl_nv_cache.o 00:02:44.965 CC lib/nvmf/vfio_user.o 00:02:44.965 CC lib/ftl/ftl_band.o 00:02:44.965 CC lib/nvmf/rdma.o 00:02:44.965 CC lib/ftl/ftl_band_ops.o 00:02:44.965 CC lib/ftl/ftl_writer.o 00:02:44.965 CC lib/ftl/ftl_rq.o 00:02:44.965 CC lib/ftl/ftl_reloc.o 00:02:44.965 CC lib/ftl/ftl_l2p_cache.o 00:02:44.965 CC lib/ftl/ftl_p2l.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:44.965 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:44.965 CC lib/ftl/utils/ftl_conf.o 00:02:44.965 CC lib/ftl/utils/ftl_md.o 00:02:44.965 CC lib/ftl/utils/ftl_bitmap.o 00:02:44.965 CC lib/ftl/utils/ftl_property.o 00:02:44.965 CC lib/ftl/utils/ftl_mempool.o 00:02:44.965 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:44.965 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:44.965 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:44.965 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:44.965 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:44.965 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:44.965 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:44.965 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:44.965 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:44.965 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:44.965 CC 
lib/ftl/base/ftl_base_dev.o 00:02:44.965 CC lib/ftl/base/ftl_base_bdev.o 00:02:44.965 CC lib/ftl/ftl_trace.o 00:02:45.223 LIB libspdk_nbd.a 00:02:45.482 LIB libspdk_scsi.a 00:02:45.482 LIB libspdk_ublk.a 00:02:45.482 LIB libspdk_ftl.a 00:02:45.741 CC lib/iscsi/conn.o 00:02:45.741 CC lib/iscsi/init_grp.o 00:02:45.741 CC lib/iscsi/iscsi.o 00:02:45.741 CC lib/iscsi/param.o 00:02:45.741 CC lib/iscsi/portal_grp.o 00:02:45.741 CC lib/iscsi/md5.o 00:02:45.741 CC lib/iscsi/tgt_node.o 00:02:45.741 CC lib/iscsi/iscsi_subsystem.o 00:02:45.741 CC lib/iscsi/iscsi_rpc.o 00:02:45.741 CC lib/iscsi/task.o 00:02:45.741 CC lib/vhost/vhost.o 00:02:45.741 CC lib/vhost/vhost_scsi.o 00:02:45.741 CC lib/vhost/vhost_rpc.o 00:02:45.741 CC lib/vhost/vhost_blk.o 00:02:45.741 CC lib/vhost/rte_vhost_user.o 00:02:46.309 LIB libspdk_nvmf.a 00:02:46.309 LIB libspdk_vhost.a 00:02:46.309 LIB libspdk_iscsi.a 00:02:46.877 CC module/env_dpdk/env_dpdk_rpc.o 00:02:46.877 CC module/vfu_device/vfu_virtio.o 00:02:46.877 CC module/vfu_device/vfu_virtio_blk.o 00:02:46.877 CC module/vfu_device/vfu_virtio_scsi.o 00:02:46.877 CC module/vfu_device/vfu_virtio_rpc.o 00:02:46.877 CC module/sock/posix/posix.o 00:02:46.877 LIB libspdk_env_dpdk_rpc.a 00:02:46.877 CC module/scheduler/gscheduler/gscheduler.o 00:02:46.877 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:46.877 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:46.877 CC module/accel/iaa/accel_iaa_rpc.o 00:02:46.877 CC module/accel/iaa/accel_iaa.o 00:02:46.877 CC module/accel/error/accel_error.o 00:02:46.877 CC module/accel/ioat/accel_ioat_rpc.o 00:02:46.877 CC module/accel/error/accel_error_rpc.o 00:02:46.877 CC module/accel/ioat/accel_ioat.o 00:02:46.877 CC module/blob/bdev/blob_bdev.o 00:02:46.877 CC module/accel/dsa/accel_dsa.o 00:02:46.877 CC module/accel/dsa/accel_dsa_rpc.o 00:02:47.136 LIB libspdk_scheduler_dpdk_governor.a 00:02:47.136 LIB libspdk_scheduler_gscheduler.a 00:02:47.136 LIB libspdk_scheduler_dynamic.a 00:02:47.136 LIB libspdk_accel_error.a 00:02:47.136 LIB libspdk_accel_ioat.a 00:02:47.136 LIB libspdk_accel_iaa.a 00:02:47.136 LIB libspdk_blob_bdev.a 00:02:47.136 LIB libspdk_accel_dsa.a 00:02:47.136 LIB libspdk_vfu_device.a 00:02:47.395 LIB libspdk_sock_posix.a 00:02:47.395 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:47.395 CC module/bdev/nvme/bdev_nvme.o 00:02:47.395 CC module/bdev/nvme/nvme_rpc.o 00:02:47.395 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:47.395 CC module/bdev/nvme/bdev_mdns_client.o 00:02:47.395 CC module/bdev/nvme/vbdev_opal.o 00:02:47.395 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:47.654 CC module/bdev/split/vbdev_split_rpc.o 00:02:47.654 CC module/bdev/split/vbdev_split.o 00:02:47.654 CC module/bdev/malloc/bdev_malloc.o 00:02:47.654 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:47.654 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:47.654 CC module/bdev/passthru/vbdev_passthru.o 00:02:47.654 CC module/bdev/null/bdev_null_rpc.o 00:02:47.654 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:47.654 CC module/bdev/null/bdev_null.o 00:02:47.654 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:47.654 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:47.654 CC module/bdev/gpt/gpt.o 00:02:47.654 CC module/bdev/gpt/vbdev_gpt.o 00:02:47.654 CC module/bdev/error/vbdev_error.o 00:02:47.654 CC module/bdev/aio/bdev_aio.o 00:02:47.654 CC module/bdev/error/vbdev_error_rpc.o 00:02:47.654 CC module/bdev/aio/bdev_aio_rpc.o 00:02:47.654 CC module/bdev/raid/bdev_raid.o 00:02:47.654 CC module/bdev/raid/bdev_raid_sb.o 00:02:47.655 CC 
module/bdev/raid/bdev_raid_rpc.o 00:02:47.655 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:47.655 CC module/bdev/raid/raid1.o 00:02:47.655 CC module/bdev/raid/raid0.o 00:02:47.655 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:47.655 CC module/bdev/ftl/bdev_ftl.o 00:02:47.655 CC module/bdev/raid/concat.o 00:02:47.655 CC module/bdev/delay/vbdev_delay.o 00:02:47.655 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:47.655 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:47.655 CC module/bdev/lvol/vbdev_lvol.o 00:02:47.655 CC module/blobfs/bdev/blobfs_bdev.o 00:02:47.655 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:47.655 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:47.655 CC module/bdev/iscsi/bdev_iscsi.o 00:02:47.655 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:47.655 LIB libspdk_blobfs_bdev.a 00:02:47.655 LIB libspdk_bdev_split.a 00:02:47.655 LIB libspdk_bdev_gpt.a 00:02:47.655 LIB libspdk_bdev_null.a 00:02:47.655 LIB libspdk_bdev_passthru.a 00:02:47.655 LIB libspdk_bdev_error.a 00:02:47.655 LIB libspdk_bdev_ftl.a 00:02:47.655 LIB libspdk_bdev_aio.a 00:02:47.655 LIB libspdk_bdev_malloc.a 00:02:47.914 LIB libspdk_bdev_delay.a 00:02:47.914 LIB libspdk_bdev_zone_block.a 00:02:47.914 LIB libspdk_bdev_iscsi.a 00:02:47.914 LIB libspdk_bdev_virtio.a 00:02:47.914 LIB libspdk_bdev_lvol.a 00:02:48.172 LIB libspdk_bdev_raid.a 00:02:48.741 LIB libspdk_bdev_nvme.a 00:02:49.309 CC module/event/subsystems/iobuf/iobuf.o 00:02:49.309 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:49.309 CC module/event/subsystems/sock/sock.o 00:02:49.309 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:49.309 CC module/event/subsystems/scheduler/scheduler.o 00:02:49.309 CC module/event/subsystems/vfu_tgt/vfu_tgt.o 00:02:49.309 CC module/event/subsystems/vmd/vmd.o 00:02:49.309 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:49.309 LIB libspdk_event_sock.a 00:02:49.309 LIB libspdk_event_iobuf.a 00:02:49.309 LIB libspdk_event_vhost_blk.a 00:02:49.309 LIB libspdk_event_vfu_tgt.a 00:02:49.309 LIB libspdk_event_vmd.a 00:02:49.309 LIB libspdk_event_scheduler.a 00:02:49.569 CC module/event/subsystems/accel/accel.o 00:02:49.828 LIB libspdk_event_accel.a 00:02:50.088 CC module/event/subsystems/bdev/bdev.o 00:02:50.088 LIB libspdk_event_bdev.a 00:02:50.347 CC module/event/subsystems/nbd/nbd.o 00:02:50.347 CC module/event/subsystems/ublk/ublk.o 00:02:50.347 CC module/event/subsystems/scsi/scsi.o 00:02:50.347 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:50.347 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:50.606 LIB libspdk_event_nbd.a 00:02:50.606 LIB libspdk_event_ublk.a 00:02:50.606 LIB libspdk_event_scsi.a 00:02:50.606 LIB libspdk_event_nvmf.a 00:02:50.865 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:50.865 CC module/event/subsystems/iscsi/iscsi.o 00:02:50.865 LIB libspdk_event_vhost_scsi.a 00:02:50.865 LIB libspdk_event_iscsi.a 00:02:51.124 CC test/rpc_client/rpc_client_test.o 00:02:51.124 TEST_HEADER include/spdk/accel.h 00:02:51.124 TEST_HEADER include/spdk/barrier.h 00:02:51.124 TEST_HEADER include/spdk/accel_module.h 00:02:51.394 TEST_HEADER include/spdk/assert.h 00:02:51.394 TEST_HEADER include/spdk/bdev.h 00:02:51.394 TEST_HEADER include/spdk/base64.h 00:02:51.394 TEST_HEADER include/spdk/bdev_zone.h 00:02:51.394 TEST_HEADER include/spdk/bdev_module.h 00:02:51.394 TEST_HEADER include/spdk/bit_pool.h 00:02:51.394 TEST_HEADER include/spdk/bit_array.h 00:02:51.394 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:51.394 TEST_HEADER include/spdk/blobfs.h 00:02:51.394 TEST_HEADER include/spdk/blob.h 
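The TEST_HEADER records that begin interleaving here are SPDK's public-header self-containment check: each header under include/spdk/ is compiled in its own translation unit (the matching CXX test/cpp_headers/*.o records follow a little further down), so a header that forgets one of its own includes fails loudly instead of compiling by accident. A rough shell sketch of the idea, not SPDK's actual harness:

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    for h in include/spdk/*.h; do
      # one-line translation unit that includes only this header
      echo "#include <spdk/$(basename "$h")>" \
        | c++ -I include -x c++ -c - -o /dev/null \
        || echo "not self-contained: $h"
    done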
00:02:51.394 TEST_HEADER include/spdk/blob_bdev.h 00:02:51.394 TEST_HEADER include/spdk/config.h 00:02:51.394 TEST_HEADER include/spdk/conf.h 00:02:51.394 TEST_HEADER include/spdk/cpuset.h 00:02:51.394 TEST_HEADER include/spdk/crc16.h 00:02:51.394 TEST_HEADER include/spdk/crc32.h 00:02:51.394 TEST_HEADER include/spdk/crc64.h 00:02:51.394 TEST_HEADER include/spdk/dif.h 00:02:51.394 TEST_HEADER include/spdk/dma.h 00:02:51.394 TEST_HEADER include/spdk/env_dpdk.h 00:02:51.394 TEST_HEADER include/spdk/env.h 00:02:51.394 TEST_HEADER include/spdk/endian.h 00:02:51.394 TEST_HEADER include/spdk/fd_group.h 00:02:51.394 TEST_HEADER include/spdk/fd.h 00:02:51.394 TEST_HEADER include/spdk/event.h 00:02:51.394 CC app/spdk_nvme_perf/perf.o 00:02:51.394 TEST_HEADER include/spdk/ftl.h 00:02:51.394 TEST_HEADER include/spdk/file.h 00:02:51.394 TEST_HEADER include/spdk/gpt_spec.h 00:02:51.394 CXX app/trace/trace.o 00:02:51.394 TEST_HEADER include/spdk/hexlify.h 00:02:51.394 TEST_HEADER include/spdk/idxd.h 00:02:51.394 TEST_HEADER include/spdk/histogram_data.h 00:02:51.394 TEST_HEADER include/spdk/idxd_spec.h 00:02:51.394 CC app/spdk_lspci/spdk_lspci.o 00:02:51.394 CC app/spdk_nvme_discover/discovery_aer.o 00:02:51.394 TEST_HEADER include/spdk/ioat.h 00:02:51.394 TEST_HEADER include/spdk/init.h 00:02:51.394 TEST_HEADER include/spdk/ioat_spec.h 00:02:51.394 CC app/spdk_top/spdk_top.o 00:02:51.394 TEST_HEADER include/spdk/json.h 00:02:51.394 TEST_HEADER include/spdk/iscsi_spec.h 00:02:51.394 TEST_HEADER include/spdk/jsonrpc.h 00:02:51.394 TEST_HEADER include/spdk/likely.h 00:02:51.394 TEST_HEADER include/spdk/log.h 00:02:51.394 TEST_HEADER include/spdk/memory.h 00:02:51.394 TEST_HEADER include/spdk/lvol.h 00:02:51.394 CC app/spdk_nvme_identify/identify.o 00:02:51.394 TEST_HEADER include/spdk/mmio.h 00:02:51.394 TEST_HEADER include/spdk/nbd.h 00:02:51.394 CC app/trace_record/trace_record.o 00:02:51.394 TEST_HEADER include/spdk/notify.h 00:02:51.394 TEST_HEADER include/spdk/nvme.h 00:02:51.394 TEST_HEADER include/spdk/nvme_intel.h 00:02:51.394 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:51.394 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:51.394 TEST_HEADER include/spdk/nvme_spec.h 00:02:51.394 TEST_HEADER include/spdk/nvme_zns.h 00:02:51.394 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:51.394 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:51.394 TEST_HEADER include/spdk/nvmf.h 00:02:51.394 TEST_HEADER include/spdk/nvmf_spec.h 00:02:51.394 TEST_HEADER include/spdk/nvmf_transport.h 00:02:51.394 TEST_HEADER include/spdk/opal.h 00:02:51.394 TEST_HEADER include/spdk/pci_ids.h 00:02:51.394 TEST_HEADER include/spdk/opal_spec.h 00:02:51.394 TEST_HEADER include/spdk/pipe.h 00:02:51.394 TEST_HEADER include/spdk/queue.h 00:02:51.394 TEST_HEADER include/spdk/reduce.h 00:02:51.394 TEST_HEADER include/spdk/rpc.h 00:02:51.394 TEST_HEADER include/spdk/scsi.h 00:02:51.394 TEST_HEADER include/spdk/scheduler.h 00:02:51.394 TEST_HEADER include/spdk/sock.h 00:02:51.394 TEST_HEADER include/spdk/scsi_spec.h 00:02:51.394 TEST_HEADER include/spdk/string.h 00:02:51.394 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:51.394 TEST_HEADER include/spdk/stdinc.h 00:02:51.394 TEST_HEADER include/spdk/trace.h 00:02:51.394 TEST_HEADER include/spdk/thread.h 00:02:51.394 TEST_HEADER include/spdk/trace_parser.h 00:02:51.394 TEST_HEADER include/spdk/tree.h 00:02:51.394 TEST_HEADER include/spdk/ublk.h 00:02:51.394 TEST_HEADER include/spdk/uuid.h 00:02:51.394 TEST_HEADER include/spdk/util.h 00:02:51.394 TEST_HEADER include/spdk/version.h 00:02:51.394 
TEST_HEADER include/spdk/vfio_user_pci.h 00:02:51.394 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:51.394 TEST_HEADER include/spdk/vhost.h 00:02:51.394 CC app/nvmf_tgt/nvmf_main.o 00:02:51.394 TEST_HEADER include/spdk/vmd.h 00:02:51.394 TEST_HEADER include/spdk/xor.h 00:02:51.394 TEST_HEADER include/spdk/zipf.h 00:02:51.394 CXX test/cpp_headers/accel.o 00:02:51.394 CXX test/cpp_headers/assert.o 00:02:51.394 CXX test/cpp_headers/accel_module.o 00:02:51.394 CXX test/cpp_headers/barrier.o 00:02:51.394 CXX test/cpp_headers/base64.o 00:02:51.394 CXX test/cpp_headers/bdev.o 00:02:51.394 CXX test/cpp_headers/bdev_module.o 00:02:51.394 CXX test/cpp_headers/bdev_zone.o 00:02:51.394 CXX test/cpp_headers/blob_bdev.o 00:02:51.394 CXX test/cpp_headers/bit_array.o 00:02:51.394 CXX test/cpp_headers/bit_pool.o 00:02:51.394 CXX test/cpp_headers/blobfs_bdev.o 00:02:51.394 CC app/vhost/vhost.o 00:02:51.394 CXX test/cpp_headers/blobfs.o 00:02:51.394 CXX test/cpp_headers/blob.o 00:02:51.394 CXX test/cpp_headers/conf.o 00:02:51.394 CXX test/cpp_headers/config.o 00:02:51.394 CXX test/cpp_headers/cpuset.o 00:02:51.394 CXX test/cpp_headers/crc16.o 00:02:51.394 CXX test/cpp_headers/crc32.o 00:02:51.394 CXX test/cpp_headers/crc64.o 00:02:51.394 CXX test/cpp_headers/dif.o 00:02:51.394 CXX test/cpp_headers/dma.o 00:02:51.394 CXX test/cpp_headers/endian.o 00:02:51.394 CXX test/cpp_headers/env_dpdk.o 00:02:51.394 CXX test/cpp_headers/env.o 00:02:51.394 CXX test/cpp_headers/event.o 00:02:51.394 CC app/iscsi_tgt/iscsi_tgt.o 00:02:51.394 CXX test/cpp_headers/fd.o 00:02:51.394 CXX test/cpp_headers/fd_group.o 00:02:51.394 CXX test/cpp_headers/file.o 00:02:51.394 CXX test/cpp_headers/ftl.o 00:02:51.394 CC app/spdk_dd/spdk_dd.o 00:02:51.394 CXX test/cpp_headers/gpt_spec.o 00:02:51.394 CXX test/cpp_headers/hexlify.o 00:02:51.394 CXX test/cpp_headers/histogram_data.o 00:02:51.394 CXX test/cpp_headers/idxd.o 00:02:51.394 CXX test/cpp_headers/idxd_spec.o 00:02:51.394 CC app/spdk_tgt/spdk_tgt.o 00:02:51.394 CXX test/cpp_headers/init.o 00:02:51.394 CC test/nvme/sgl/sgl.o 00:02:51.394 CC test/nvme/connect_stress/connect_stress.o 00:02:51.394 CC test/nvme/aer/aer.o 00:02:51.394 CC test/nvme/reset/reset.o 00:02:51.394 CC test/env/pci/pci_ut.o 00:02:51.394 CC test/nvme/err_injection/err_injection.o 00:02:51.394 CC test/env/vtophys/vtophys.o 00:02:51.394 CC test/nvme/startup/startup.o 00:02:51.394 CC test/event/reactor_perf/reactor_perf.o 00:02:51.394 CC test/event/event_perf/event_perf.o 00:02:51.394 CC test/nvme/compliance/nvme_compliance.o 00:02:51.394 CC test/nvme/e2edp/nvme_dp.o 00:02:51.394 CC test/nvme/reserve/reserve.o 00:02:51.394 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:51.394 CC test/app/histogram_perf/histogram_perf.o 00:02:51.394 CC test/nvme/cuse/cuse.o 00:02:51.394 CC test/nvme/boot_partition/boot_partition.o 00:02:51.394 CC test/nvme/fdp/fdp.o 00:02:51.394 CC test/env/memory/memory_ut.o 00:02:51.394 CC test/nvme/overhead/overhead.o 00:02:51.394 CC test/nvme/simple_copy/simple_copy.o 00:02:51.394 CC test/thread/lock/spdk_lock.o 00:02:51.394 CC test/nvme/fused_ordering/fused_ordering.o 00:02:51.394 CXX test/cpp_headers/ioat.o 00:02:51.394 CC test/event/reactor/reactor.o 00:02:51.394 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:51.394 CC test/app/stub/stub.o 00:02:51.394 CC examples/ioat/verify/verify.o 00:02:51.394 CC test/thread/poller_perf/poller_perf.o 00:02:51.394 CC test/app/jsoncat/jsoncat.o 00:02:51.395 CC examples/ioat/perf/perf.o 00:02:51.395 CC test/event/app_repeat/app_repeat.o 
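From here the CC and LINK records cover SPDK's test and example tools (test/dma/test_dma, examples/nvme/hello_world, the nvme tools such as aer, reset, and sgl); each LINK record names a finished binary. A sketch of trying one of them by hand once this build completes, assuming the usual build/examples output directory (the path is not shown in this log) and a host prepared by scripts/setup.sh, which appears near the end of this log in its reset form:

    cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    sudo ./scripts/setup.sh             # allocate hugepages and bind NVMe controllers to a userspace driver
    sudo ./build/examples/hello_world   # assumed path; writes and reads back a test string on the first namespace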
00:02:51.395 CC test/dma/test_dma/test_dma.o 00:02:51.395 CC examples/util/zipf/zipf.o 00:02:51.395 CC test/accel/dif/dif.o 00:02:51.395 CC examples/accel/perf/accel_perf.o 00:02:51.395 CC examples/nvme/hello_world/hello_world.o 00:02:51.395 CC examples/vmd/lsvmd/lsvmd.o 00:02:51.395 CC examples/vmd/led/led.o 00:02:51.395 CC examples/sock/hello_world/hello_sock.o 00:02:51.395 CC examples/nvme/arbitration/arbitration.o 00:02:51.395 CC examples/nvme/hotplug/hotplug.o 00:02:51.395 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:51.395 CC examples/nvme/reconnect/reconnect.o 00:02:51.395 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:51.395 CC test/event/scheduler/scheduler.o 00:02:51.395 CC examples/idxd/perf/perf.o 00:02:51.395 CC examples/nvme/abort/abort.o 00:02:51.395 CC test/blobfs/mkfs/mkfs.o 00:02:51.395 CC app/fio/nvme/fio_plugin.o 00:02:51.395 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:51.395 LINK rpc_client_test 00:02:51.395 LINK spdk_lspci 00:02:51.395 CC examples/blob/cli/blobcli.o 00:02:51.395 CC test/app/bdev_svc/bdev_svc.o 00:02:51.395 CC examples/thread/thread/thread_ex.o 00:02:51.395 CC test/bdev/bdevio/bdevio.o 00:02:51.395 CC examples/blob/hello_world/hello_blob.o 00:02:51.395 CC test/env/mem_callbacks/mem_callbacks.o 00:02:51.395 CC test/lvol/esnap/esnap.o 00:02:51.395 CC examples/bdev/hello_world/hello_bdev.o 00:02:51.395 CC examples/nvmf/nvmf/nvmf.o 00:02:51.395 CC app/fio/bdev/fio_plugin.o 00:02:51.395 CC examples/bdev/bdevperf/bdevperf.o 00:02:51.658 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:51.658 LINK spdk_nvme_discover 00:02:51.658 CXX test/cpp_headers/ioat_spec.o 00:02:51.658 LINK interrupt_tgt 00:02:51.658 CXX test/cpp_headers/iscsi_spec.o 00:02:51.658 CXX test/cpp_headers/json.o 00:02:51.658 CXX test/cpp_headers/jsonrpc.o 00:02:51.658 CXX test/cpp_headers/likely.o 00:02:51.658 CXX test/cpp_headers/log.o 00:02:51.658 CXX test/cpp_headers/lvol.o 00:02:51.658 LINK reactor_perf 00:02:51.658 CXX test/cpp_headers/memory.o 00:02:51.658 CXX test/cpp_headers/mmio.o 00:02:51.658 CXX test/cpp_headers/nbd.o 00:02:51.658 LINK reactor 00:02:51.658 CXX test/cpp_headers/notify.o 00:02:51.658 LINK spdk_trace_record 00:02:51.658 CXX test/cpp_headers/nvme.o 00:02:51.658 LINK vtophys 00:02:51.658 CXX test/cpp_headers/nvme_intel.o 00:02:51.658 CXX test/cpp_headers/nvme_ocssd.o 00:02:51.658 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:51.658 CXX test/cpp_headers/nvme_spec.o 00:02:51.658 CXX test/cpp_headers/nvme_zns.o 00:02:51.658 CXX test/cpp_headers/nvmf_cmd.o 00:02:51.658 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:51.658 CXX test/cpp_headers/nvmf.o 00:02:51.658 CXX test/cpp_headers/nvmf_spec.o 00:02:51.658 LINK jsoncat 00:02:51.658 CXX test/cpp_headers/nvmf_transport.o 00:02:51.658 CXX test/cpp_headers/opal.o 00:02:51.658 CXX test/cpp_headers/opal_spec.o 00:02:51.658 LINK histogram_perf 00:02:51.658 LINK nvmf_tgt 00:02:51.658 LINK env_dpdk_post_init 00:02:51.658 LINK poller_perf 00:02:51.658 LINK event_perf 00:02:51.658 LINK vhost 00:02:51.658 CXX test/cpp_headers/pci_ids.o 00:02:51.658 CXX test/cpp_headers/pipe.o 00:02:51.658 LINK lsvmd 00:02:51.658 CXX test/cpp_headers/queue.o 00:02:51.658 CXX test/cpp_headers/reduce.o 00:02:51.658 CXX test/cpp_headers/rpc.o 00:02:51.658 CXX test/cpp_headers/scheduler.o 00:02:51.658 LINK led 00:02:51.658 LINK connect_stress 00:02:51.658 CXX test/cpp_headers/scsi.o 00:02:51.658 LINK zipf 00:02:51.658 LINK app_repeat 00:02:51.658 LINK startup 00:02:51.658 LINK boot_partition 00:02:51.658 LINK err_injection 00:02:51.658 LINK iscsi_tgt 
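The records just below link the fuzz tools this short-fuzz job is built around (nvme_fuzz, iscsi_fuzz, vhost_fuzz, llvm_nvme_fuzz, llvm_vfio_fuzz). Further down, the harness exports LCOV_OPTS and LCOV with --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh, pointing lcov at an llvm-based gcov replacement, the conventional way to read coverage data from clang-built binaries. Such a wrapper is typically a one-liner delegating to llvm-cov's gcov emulation; a sketch of the usual shape, not necessarily SPDK's exact script:

    #!/usr/bin/env bash
    # gcov-compatible front end for clang coverage data:
    # lcov invokes this wherever it would invoke gcov
    exec llvm-cov gcov "$@"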
00:02:51.658 LINK stub 00:02:51.658 CXX test/cpp_headers/scsi_spec.o 00:02:51.658 CXX test/cpp_headers/sock.o 00:02:51.658 LINK doorbell_aers 00:02:51.658 LINK spdk_tgt 00:02:51.658 LINK fused_ordering 00:02:51.658 LINK reserve 00:02:51.658 LINK pmr_persistence 00:02:51.658 CXX test/cpp_headers/stdinc.o 00:02:51.658 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:51.658 LINK simple_copy 00:02:51.658 LINK cmb_copy 00:02:51.658 LINK verify 00:02:51.658 LINK ioat_perf 00:02:51.658 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:51.658 LINK mkfs 00:02:51.658 LINK reset 00:02:51.658 LINK hello_world 00:02:51.658 LINK bdev_svc 00:02:51.658 LINK hotplug 00:02:51.658 LINK sgl 00:02:51.658 LINK hello_sock 00:02:51.658 LINK aer 00:02:51.658 LINK nvme_dp 00:02:51.658 LINK scheduler 00:02:51.658 CC test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.o 00:02:51.658 LINK fdp 00:02:51.658 CC test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.o 00:02:51.658 LINK overhead 00:02:51.658 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:51.921 CXX test/cpp_headers/string.o 00:02:51.921 LINK spdk_trace 00:02:51.921 CXX test/cpp_headers/thread.o 00:02:51.921 LINK hello_blob 00:02:51.921 CXX test/cpp_headers/trace.o 00:02:51.921 CXX test/cpp_headers/trace_parser.o 00:02:51.921 CXX test/cpp_headers/tree.o 00:02:51.921 CXX test/cpp_headers/ublk.o 00:02:51.921 CXX test/cpp_headers/util.o 00:02:51.921 CXX test/cpp_headers/uuid.o 00:02:51.921 LINK thread 00:02:51.921 CXX test/cpp_headers/version.o 00:02:51.921 CXX test/cpp_headers/vfio_user_pci.o 00:02:51.921 LINK hello_bdev 00:02:51.921 CXX test/cpp_headers/vfio_user_spec.o 00:02:51.921 CXX test/cpp_headers/vhost.o 00:02:51.921 CXX test/cpp_headers/vmd.o 00:02:51.921 CXX test/cpp_headers/xor.o 00:02:51.921 CXX test/cpp_headers/zipf.o 00:02:51.921 LINK reconnect 00:02:51.921 LINK idxd_perf 00:02:51.921 LINK nvmf 00:02:51.921 LINK test_dma 00:02:51.921 LINK arbitration 00:02:51.921 LINK abort 00:02:51.921 LINK dif 00:02:51.921 LINK spdk_dd 00:02:51.921 LINK pci_ut 00:02:51.921 LINK nvme_compliance 00:02:51.921 LINK bdevio 00:02:51.921 LINK nvme_fuzz 00:02:51.921 LINK accel_perf 00:02:52.179 LINK nvme_manage 00:02:52.179 LINK llvm_vfio_fuzz 00:02:52.179 LINK mem_callbacks 00:02:52.179 LINK blobcli 00:02:52.179 LINK spdk_nvme 00:02:52.179 LINK spdk_bdev 00:02:52.179 LINK bdevperf 00:02:52.179 LINK spdk_nvme_identify 00:02:52.179 LINK spdk_nvme_perf 00:02:52.437 LINK vhost_fuzz 00:02:52.437 LINK memory_ut 00:02:52.437 LINK spdk_top 00:02:52.437 LINK cuse 00:02:52.695 LINK llvm_nvme_fuzz 00:02:52.953 LINK spdk_lock 00:02:53.212 LINK iscsi_fuzz 00:02:55.118 LINK esnap 00:02:55.377 00:02:55.377 real 0m41.448s 00:02:55.377 user 5m43.638s 00:02:55.377 sys 2m48.580s 00:02:55.377 04:46:30 -- common/autotest_common.sh@1115 -- $ xtrace_disable 00:02:55.377 04:46:30 -- common/autotest_common.sh@10 -- $ set +x 00:02:55.377 ************************************ 00:02:55.377 END TEST make 00:02:55.377 ************************************ 00:02:55.377 04:46:30 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:02:55.377 04:46:30 -- common/autotest_common.sh@1690 -- # lcov --version 00:02:55.377 04:46:30 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:02:55.377 04:46:30 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:02:55.377 04:46:30 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:02:55.377 04:46:30 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:02:55.377 04:46:30 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:02:55.377 04:46:30 -- scripts/common.sh@335 -- # 
IFS=.-: 00:02:55.377 04:46:30 -- scripts/common.sh@335 -- # read -ra ver1 00:02:55.377 04:46:30 -- scripts/common.sh@336 -- # IFS=.-: 00:02:55.377 04:46:30 -- scripts/common.sh@336 -- # read -ra ver2 00:02:55.377 04:46:30 -- scripts/common.sh@337 -- # local 'op=<' 00:02:55.377 04:46:30 -- scripts/common.sh@339 -- # ver1_l=2 00:02:55.377 04:46:30 -- scripts/common.sh@340 -- # ver2_l=1 00:02:55.377 04:46:30 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:02:55.377 04:46:30 -- scripts/common.sh@343 -- # case "$op" in 00:02:55.377 04:46:30 -- scripts/common.sh@344 -- # : 1 00:02:55.377 04:46:30 -- scripts/common.sh@363 -- # (( v = 0 )) 00:02:55.377 04:46:30 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:55.377 04:46:30 -- scripts/common.sh@364 -- # decimal 1 00:02:55.377 04:46:30 -- scripts/common.sh@352 -- # local d=1 00:02:55.377 04:46:30 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:02:55.377 04:46:30 -- scripts/common.sh@354 -- # echo 1 00:02:55.377 04:46:30 -- scripts/common.sh@364 -- # ver1[v]=1 00:02:55.377 04:46:30 -- scripts/common.sh@365 -- # decimal 2 00:02:55.377 04:46:30 -- scripts/common.sh@352 -- # local d=2 00:02:55.377 04:46:30 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:02:55.377 04:46:30 -- scripts/common.sh@354 -- # echo 2 00:02:55.377 04:46:30 -- scripts/common.sh@365 -- # ver2[v]=2 00:02:55.377 04:46:30 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:02:55.377 04:46:30 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:02:55.377 04:46:30 -- scripts/common.sh@367 -- # return 0 00:02:55.377 04:46:30 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:02:55.377 04:46:30 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:02:55.377 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:55.377 --rc genhtml_branch_coverage=1 00:02:55.377 --rc genhtml_function_coverage=1 00:02:55.377 --rc genhtml_legend=1 00:02:55.377 --rc geninfo_all_blocks=1 00:02:55.377 --rc geninfo_unexecuted_blocks=1 00:02:55.377 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:55.377 ' 00:02:55.377 04:46:30 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:02:55.377 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:55.377 --rc genhtml_branch_coverage=1 00:02:55.377 --rc genhtml_function_coverage=1 00:02:55.377 --rc genhtml_legend=1 00:02:55.377 --rc geninfo_all_blocks=1 00:02:55.377 --rc geninfo_unexecuted_blocks=1 00:02:55.377 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:55.377 ' 00:02:55.377 04:46:30 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:02:55.377 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:55.377 --rc genhtml_branch_coverage=1 00:02:55.377 --rc genhtml_function_coverage=1 00:02:55.377 --rc genhtml_legend=1 00:02:55.377 --rc geninfo_all_blocks=1 00:02:55.377 --rc geninfo_unexecuted_blocks=1 00:02:55.377 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:55.377 ' 00:02:55.377 04:46:30 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:02:55.377 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:02:55.377 --rc genhtml_branch_coverage=1 00:02:55.377 --rc genhtml_function_coverage=1 00:02:55.377 --rc genhtml_legend=1 00:02:55.377 --rc geninfo_all_blocks=1 00:02:55.377 --rc geninfo_unexecuted_blocks=1 00:02:55.377 --gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:02:55.377 ' 00:02:55.377 04:46:30 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:02:55.377 04:46:30 -- nvmf/common.sh@7 -- # uname -s 00:02:55.378 04:46:30 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:55.378 04:46:30 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:55.378 04:46:30 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:55.378 04:46:30 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:55.378 04:46:30 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:55.378 04:46:30 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:55.378 04:46:30 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:55.378 04:46:30 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:55.378 04:46:30 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:55.637 04:46:30 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:55.637 04:46:30 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:02:55.637 04:46:30 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:02:55.637 04:46:30 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:55.637 04:46:30 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:55.637 04:46:30 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:55.637 04:46:30 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:02:55.637 04:46:30 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:55.637 04:46:30 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:55.637 04:46:30 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:55.637 04:46:30 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:55.637 04:46:30 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:55.637 04:46:30 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:55.637 04:46:30 -- paths/export.sh@5 -- # export PATH 00:02:55.637 04:46:30 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:55.637 04:46:30 -- nvmf/common.sh@46 -- # : 0 00:02:55.637 04:46:30 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:02:55.637 04:46:30 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:02:55.637 04:46:30 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:02:55.637 04:46:30 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:55.637 
04:46:30 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:55.637 04:46:30 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:02:55.637 04:46:30 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:02:55.637 04:46:30 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:02:55.637 04:46:30 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:55.637 04:46:30 -- spdk/autotest.sh@32 -- # uname -s 00:02:55.637 04:46:30 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:55.637 04:46:30 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:55.637 04:46:30 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:55.637 04:46:30 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:55.637 04:46:30 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/coredumps 00:02:55.637 04:46:30 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:55.637 04:46:30 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:55.637 04:46:30 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:55.637 04:46:30 -- spdk/autotest.sh@48 -- # udevadm_pid=3604356 00:02:55.637 04:46:30 -- spdk/autotest.sh@51 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:55.637 04:46:30 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:55.637 04:46:30 -- spdk/autotest.sh@54 -- # echo 3604358 00:02:55.637 04:46:30 -- spdk/autotest.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:55.637 04:46:30 -- spdk/autotest.sh@56 -- # echo 3604359 00:02:55.637 04:46:30 -- spdk/autotest.sh@55 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power 00:02:55.637 04:46:30 -- spdk/autotest.sh@58 -- # [[ ............................... 
!= QEMU ]] 00:02:55.637 04:46:30 -- spdk/autotest.sh@60 -- # echo 3604360 00:02:55.637 04:46:30 -- spdk/autotest.sh@59 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:55.637 04:46:30 -- spdk/autotest.sh@62 -- # echo 3604361 00:02:55.637 04:46:30 -- spdk/autotest.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power -l 00:02:55.637 04:46:30 -- spdk/autotest.sh@66 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:55.637 04:46:30 -- spdk/autotest.sh@68 -- # timing_enter autotest 00:02:55.637 04:46:30 -- common/autotest_common.sh@722 -- # xtrace_disable 00:02:55.637 04:46:30 -- common/autotest_common.sh@10 -- # set +x 00:02:55.637 04:46:30 -- spdk/autotest.sh@70 -- # create_test_list 00:02:55.637 04:46:30 -- common/autotest_common.sh@746 -- # xtrace_disable 00:02:55.637 04:46:30 -- common/autotest_common.sh@10 -- # set +x 00:02:55.637 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-bmc-pm.bmc.pm.log 00:02:55.637 04:46:30 -- spdk/autotest.sh@72 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/autotest.sh 00:02:55.637 Redirecting to /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/power/collect-cpu-temp.pm.log 00:02:55.637 04:46:30 -- spdk/autotest.sh@72 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:55.637 04:46:30 -- spdk/autotest.sh@72 -- # src=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:55.637 04:46:30 -- spdk/autotest.sh@73 -- # out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output 00:02:55.637 04:46:30 -- spdk/autotest.sh@74 -- # cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:02:55.637 04:46:30 -- spdk/autotest.sh@76 -- # freebsd_update_contigmem_mod 00:02:55.637 04:46:30 -- common/autotest_common.sh@1450 -- # uname 00:02:55.637 04:46:30 -- common/autotest_common.sh@1450 -- # '[' Linux = FreeBSD ']' 00:02:55.637 04:46:30 -- spdk/autotest.sh@77 -- # freebsd_set_maxsock_buf 00:02:55.637 04:46:30 -- common/autotest_common.sh@1470 -- # uname 00:02:55.637 04:46:30 -- common/autotest_common.sh@1470 -- # [[ Linux = FreeBSD ]] 00:02:55.637 04:46:30 -- spdk/autotest.sh@79 -- # [[ y == y ]] 00:02:55.637 04:46:30 -- spdk/autotest.sh@81 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh --version 00:02:55.637 lcov: LCOV version 1.15 00:02:55.638 04:46:30 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -i -t Baseline -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info 00:03:03.825 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcno 00:03:03.825 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcno 00:03:03.825 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcno 00:03:10.397 04:46:44 -- spdk/autotest.sh@87 -- # timing_enter pre_cleanup 00:03:10.397 04:46:44 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:10.397 04:46:44 -- common/autotest_common.sh@10 -- # set +x 00:03:10.397 04:46:44 -- spdk/autotest.sh@89 -- # rm -f 00:03:10.397 04:46:44 -- spdk/autotest.sh@92 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:13.689 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:13.689 0000:d8:00.0 (8086 0a54): Already using the nvme driver 00:03:13.689 04:46:48 -- spdk/autotest.sh@94 -- # get_zoned_devs 00:03:13.689 04:46:48 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:13.689 04:46:48 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:13.689 04:46:48 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:13.689 04:46:48 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:13.689 04:46:48 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:13.689 04:46:48 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:13.689 04:46:48 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:13.689 04:46:48 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:13.689 04:46:48 -- spdk/autotest.sh@96 -- # (( 0 > 0 )) 00:03:13.689 04:46:48 -- spdk/autotest.sh@108 -- # ls /dev/nvme0n1 00:03:13.689 04:46:48 -- spdk/autotest.sh@108 -- # grep -v p 00:03:13.689 04:46:48 -- spdk/autotest.sh@108 -- # for dev in $(ls /dev/nvme*n* | grep -v p || true) 00:03:13.689 04:46:48 -- spdk/autotest.sh@110 -- # [[ -z '' ]] 00:03:13.689 04:46:48 -- spdk/autotest.sh@111 -- # block_in_use /dev/nvme0n1 00:03:13.689 04:46:48 -- scripts/common.sh@380 -- # local block=/dev/nvme0n1 pt 00:03:13.689 04:46:48 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:13.689 No valid GPT data, bailing 00:03:13.689 04:46:48 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:13.689 04:46:48 -- scripts/common.sh@393 -- # pt= 00:03:13.689 04:46:48 -- scripts/common.sh@394 -- # return 1 00:03:13.689 04:46:48 -- spdk/autotest.sh@112 -- # dd 
if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:13.689 1+0 records in 00:03:13.689 1+0 records out 00:03:13.689 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00499657 s, 210 MB/s 00:03:13.689 04:46:48 -- spdk/autotest.sh@116 -- # sync 00:03:13.689 04:46:48 -- spdk/autotest.sh@118 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:13.689 04:46:48 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:13.689 04:46:48 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:21.816 04:46:55 -- spdk/autotest.sh@122 -- # uname -s 00:03:21.816 04:46:55 -- spdk/autotest.sh@122 -- # '[' Linux = Linux ']' 00:03:21.816 04:46:55 -- spdk/autotest.sh@123 -- # run_test setup.sh /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:21.816 04:46:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:21.816 04:46:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:21.816 04:46:55 -- common/autotest_common.sh@10 -- # set +x 00:03:21.816 ************************************ 00:03:21.816 START TEST setup.sh 00:03:21.816 ************************************ 00:03:21.816 04:46:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/test-setup.sh 00:03:21.816 * Looking for test storage... 00:03:21.816 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:21.816 04:46:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:21.816 04:46:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:21.816 04:46:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:21.816 04:46:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:21.816 04:46:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:21.816 04:46:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:21.816 04:46:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:21.816 04:46:55 -- scripts/common.sh@335 -- # IFS=.-: 00:03:21.816 04:46:55 -- scripts/common.sh@335 -- # read -ra ver1 00:03:21.816 04:46:55 -- scripts/common.sh@336 -- # IFS=.-: 00:03:21.816 04:46:55 -- scripts/common.sh@336 -- # read -ra ver2 00:03:21.816 04:46:55 -- scripts/common.sh@337 -- # local 'op=<' 00:03:21.816 04:46:55 -- scripts/common.sh@339 -- # ver1_l=2 00:03:21.816 04:46:55 -- scripts/common.sh@340 -- # ver2_l=1 00:03:21.816 04:46:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:21.816 04:46:55 -- scripts/common.sh@343 -- # case "$op" in 00:03:21.816 04:46:55 -- scripts/common.sh@344 -- # : 1 00:03:21.816 04:46:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:21.816 04:46:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:03:21.816 04:46:55 -- scripts/common.sh@364 -- # decimal 1 00:03:21.816 04:46:55 -- scripts/common.sh@352 -- # local d=1 00:03:21.816 04:46:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:21.816 04:46:55 -- scripts/common.sh@354 -- # echo 1 00:03:21.816 04:46:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:21.816 04:46:55 -- scripts/common.sh@365 -- # decimal 2 00:03:21.816 04:46:55 -- scripts/common.sh@352 -- # local d=2 00:03:21.816 04:46:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:21.816 04:46:55 -- scripts/common.sh@354 -- # echo 2 00:03:21.816 04:46:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:21.816 04:46:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:21.816 04:46:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:21.816 04:46:55 -- scripts/common.sh@367 -- # return 0 00:03:21.816 04:46:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:21.816 04:46:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:21.816 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:21.816 --rc genhtml_branch_coverage=1 00:03:21.816 --rc genhtml_function_coverage=1 00:03:21.816 --rc genhtml_legend=1 00:03:21.816 --rc geninfo_all_blocks=1 00:03:21.816 --rc geninfo_unexecuted_blocks=1 00:03:21.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:21.817 ' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:21.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:21.817 --rc genhtml_branch_coverage=1 00:03:21.817 --rc genhtml_function_coverage=1 00:03:21.817 --rc genhtml_legend=1 00:03:21.817 --rc geninfo_all_blocks=1 00:03:21.817 --rc geninfo_unexecuted_blocks=1 00:03:21.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:21.817 ' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:21.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:21.817 --rc genhtml_branch_coverage=1 00:03:21.817 --rc genhtml_function_coverage=1 00:03:21.817 --rc genhtml_legend=1 00:03:21.817 --rc geninfo_all_blocks=1 00:03:21.817 --rc geninfo_unexecuted_blocks=1 00:03:21.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:21.817 ' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:21.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:21.817 --rc genhtml_branch_coverage=1 00:03:21.817 --rc genhtml_function_coverage=1 00:03:21.817 --rc genhtml_legend=1 00:03:21.817 --rc geninfo_all_blocks=1 00:03:21.817 --rc geninfo_unexecuted_blocks=1 00:03:21.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:21.817 ' 00:03:21.817 04:46:55 -- setup/test-setup.sh@10 -- # uname -s 00:03:21.817 04:46:55 -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:21.817 04:46:55 -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:21.817 04:46:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:21.817 04:46:55 -- common/autotest_common.sh@10 -- # set +x 00:03:21.817 ************************************ 00:03:21.817 START TEST acl 00:03:21.817 ************************************ 00:03:21.817 04:46:55 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/acl.sh 00:03:21.817 * Looking for test storage... 00:03:21.817 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:21.817 04:46:55 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:21.817 04:46:55 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:21.817 04:46:55 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:21.817 04:46:55 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:21.817 04:46:55 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:21.817 04:46:55 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:21.817 04:46:55 -- scripts/common.sh@335 -- # IFS=.-: 00:03:21.817 04:46:55 -- scripts/common.sh@335 -- # read -ra ver1 00:03:21.817 04:46:55 -- scripts/common.sh@336 -- # IFS=.-: 00:03:21.817 04:46:55 -- scripts/common.sh@336 -- # read -ra ver2 00:03:21.817 04:46:55 -- scripts/common.sh@337 -- # local 'op=<' 00:03:21.817 04:46:55 -- scripts/common.sh@339 -- # ver1_l=2 00:03:21.817 04:46:55 -- scripts/common.sh@340 -- # ver2_l=1 00:03:21.817 04:46:55 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:21.817 04:46:55 -- scripts/common.sh@343 -- # case "$op" in 00:03:21.817 04:46:55 -- scripts/common.sh@344 -- # : 1 00:03:21.817 04:46:55 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:21.817 04:46:55 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:21.817 04:46:55 -- scripts/common.sh@364 -- # decimal 1 00:03:21.817 04:46:55 -- scripts/common.sh@352 -- # local d=1 00:03:21.817 04:46:55 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:21.817 04:46:55 -- scripts/common.sh@354 -- # echo 1 00:03:21.817 04:46:55 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:21.817 04:46:55 -- scripts/common.sh@365 -- # decimal 2 00:03:21.817 04:46:55 -- scripts/common.sh@352 -- # local d=2 00:03:21.817 04:46:55 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:21.817 04:46:55 -- scripts/common.sh@354 -- # echo 2 00:03:21.817 04:46:55 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:21.817 04:46:55 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:21.817 04:46:55 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:21.817 04:46:55 -- scripts/common.sh@367 -- # return 0 00:03:21.817 04:46:55 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:21.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:21.817 --rc genhtml_branch_coverage=1 00:03:21.817 --rc genhtml_function_coverage=1 00:03:21.817 --rc genhtml_legend=1 00:03:21.817 --rc geninfo_all_blocks=1 00:03:21.817 --rc geninfo_unexecuted_blocks=1 00:03:21.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:21.817 ' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:21.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:21.817 --rc genhtml_branch_coverage=1 00:03:21.817 --rc genhtml_function_coverage=1 00:03:21.817 --rc genhtml_legend=1 00:03:21.817 --rc geninfo_all_blocks=1 00:03:21.817 --rc geninfo_unexecuted_blocks=1 00:03:21.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:21.817 ' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1704 
-- # export 'LCOV=lcov 00:03:21.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:21.817 --rc genhtml_branch_coverage=1 00:03:21.817 --rc genhtml_function_coverage=1 00:03:21.817 --rc genhtml_legend=1 00:03:21.817 --rc geninfo_all_blocks=1 00:03:21.817 --rc geninfo_unexecuted_blocks=1 00:03:21.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:21.817 ' 00:03:21.817 04:46:55 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:21.817 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:21.817 --rc genhtml_branch_coverage=1 00:03:21.817 --rc genhtml_function_coverage=1 00:03:21.817 --rc genhtml_legend=1 00:03:21.817 --rc geninfo_all_blocks=1 00:03:21.817 --rc geninfo_unexecuted_blocks=1 00:03:21.817 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:21.817 ' 00:03:21.817 04:46:55 -- setup/acl.sh@10 -- # get_zoned_devs 00:03:21.817 04:46:55 -- common/autotest_common.sh@1664 -- # zoned_devs=() 00:03:21.817 04:46:55 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs 00:03:21.817 04:46:55 -- common/autotest_common.sh@1665 -- # local nvme bdf 00:03:21.817 04:46:55 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme* 00:03:21.817 04:46:55 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1 00:03:21.817 04:46:55 -- common/autotest_common.sh@1657 -- # local device=nvme0n1 00:03:21.817 04:46:55 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:21.817 04:46:55 -- common/autotest_common.sh@1660 -- # [[ none != none ]] 00:03:21.817 04:46:55 -- setup/acl.sh@12 -- # devs=() 00:03:21.817 04:46:55 -- setup/acl.sh@12 -- # declare -a devs 00:03:21.817 04:46:55 -- setup/acl.sh@13 -- # drivers=() 00:03:21.817 04:46:55 -- setup/acl.sh@13 -- # declare -A drivers 00:03:21.817 04:46:55 -- setup/acl.sh@51 -- # setup reset 00:03:21.817 04:46:55 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:21.817 04:46:55 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:25.109 04:46:59 -- setup/acl.sh@52 -- # collect_setup_devs 00:03:25.109 04:46:59 -- setup/acl.sh@16 -- # local dev driver 00:03:25.109 04:46:59 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:25.109 04:46:59 -- setup/acl.sh@15 -- # setup output status 00:03:25.109 04:46:59 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:25.109 04:46:59 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:03:28.400 Hugepages 00:03:28.400 node hugesize free / total 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 00:03:28.400 Type BDF Vendor Device NUMA Driver Device Block devices 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 
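The `read -r _ dev _ _ _ driver _` loop set up just above consumes the `setup.sh status` table printed below: column 2 is taken as the BDF and column 6 as the bound driver, and the hugepage summary and column-header rows are rejected because they do not match the `*:*:*.*` PCI-address shape. A minimal standalone sketch of the same parsing idea (the function name and the piped command are illustrative, not part of the test scripts):

    # Hypothetical sketch: collect BDF -> driver pairs from a
    # setup.sh-status-style table on stdin. Rows whose second column
    # is not a PCI address (dddd:bb:dd.f) are skipped, which filters
    # out the hugepage summary and the column-header line.
    collect_devs() {
      local -A drivers=()
      local _ dev driver bdf
      while read -r _ dev _ _ _ driver _; do
        [[ $dev == *:*:*.* ]] || continue   # header / hugepage row
        drivers["$dev"]=$driver
      done
      for bdf in "${!drivers[@]}"; do
        printf '%s %s\n' "$bdf" "${drivers[$bdf]}"
      done
    }
    # usage (illustrative): ./scripts/setup.sh status | collect_devs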
04:47:02 -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 
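Every row so far is kept or dropped on the driver column alone: ioatdma-bound DMA channels hit `continue`, and the scan continues below through the remaining 0000:80:04.x channels before reaching the one nvme-bound function that gets appended to `devs`. The bound driver can also be read straight from sysfs, which is how `verify` double-checks it further down; a small sketch under the assumption the device exists (the function name is illustrative):

    # Hypothetical helper: print the kernel driver currently bound to
    # a PCI function by resolving its sysfs "driver" symlink.
    bound_driver() {
      local bdf=$1 link
      link=$(readlink -f "/sys/bus/pci/devices/$bdf/driver") || return 1
      basename "$link"    # e.g. "nvme", "ioatdma", "vfio-pci"
    }
    # e.g. bound_driver 0000:d8:00.0   # -> nvme in this run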
00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:02 -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:28.400 04:47:02 -- setup/acl.sh@20 -- # continue 00:03:28.400 04:47:02 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:03 -- setup/acl.sh@19 -- # [[ 0000:d8:00.0 == *:*:*.* ]] 00:03:28.400 04:47:03 -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:28.400 04:47:03 -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]] 00:03:28.400 04:47:03 -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:28.400 04:47:03 -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:28.400 04:47:03 -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:28.400 04:47:03 -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:28.400 04:47:03 -- setup/acl.sh@54 -- # run_test denied denied 00:03:28.400 04:47:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:28.400 04:47:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:28.400 04:47:03 -- common/autotest_common.sh@10 -- # set +x 00:03:28.400 ************************************ 00:03:28.400 START TEST denied 00:03:28.400 ************************************ 00:03:28.400 04:47:03 -- common/autotest_common.sh@1114 -- # denied 00:03:28.400 04:47:03 -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:d8:00.0' 00:03:28.400 04:47:03 -- setup/acl.sh@38 -- # setup output config 00:03:28.400 04:47:03 -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:d8:00.0' 00:03:28.400 04:47:03 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:28.400 04:47:03 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:31.693 0000:d8:00.0 (8086 0a54): Skipping denied controller at 0000:d8:00.0 00:03:31.693 04:47:06 -- setup/acl.sh@40 -- # verify 0000:d8:00.0 00:03:31.693 04:47:06 -- setup/acl.sh@28 -- # local dev driver 00:03:31.693 04:47:06 -- setup/acl.sh@30 -- # for dev in "$@" 00:03:31.693 04:47:06 -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:d8:00.0 ]] 00:03:31.693 04:47:06 -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:d8:00.0/driver 00:03:31.693 04:47:06 -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:31.693 04:47:06 -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:31.693 04:47:06 -- setup/acl.sh@41 -- # setup reset 00:03:31.693 04:47:06 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:31.693 04:47:06 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:36.969 00:03:36.969 real 0m7.940s 00:03:36.969 user 0m2.472s 00:03:36.969 sys 0m4.814s 00:03:36.969 04:47:11 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:36.969 04:47:11 -- common/autotest_common.sh@10 -- # set +x 00:03:36.969 ************************************ 00:03:36.970 END TEST denied 00:03:36.970 ************************************ 00:03:36.970 04:47:11 -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:36.970 04:47:11 -- 
common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:36.970 04:47:11 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:36.970 04:47:11 -- common/autotest_common.sh@10 -- # set +x 00:03:36.970 ************************************ 00:03:36.970 START TEST allowed 00:03:36.970 ************************************ 00:03:36.970 04:47:11 -- common/autotest_common.sh@1114 -- # allowed 00:03:36.970 04:47:11 -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:d8:00.0 00:03:36.970 04:47:11 -- setup/acl.sh@45 -- # setup output config 00:03:36.970 04:47:11 -- setup/acl.sh@46 -- # grep -E '0000:d8:00.0 .*: nvme -> .*' 00:03:36.970 04:47:11 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:36.970 04:47:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:03:41.163 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:41.163 04:47:16 -- setup/acl.sh@47 -- # verify 00:03:41.163 04:47:16 -- setup/acl.sh@28 -- # local dev driver 00:03:41.163 04:47:16 -- setup/acl.sh@48 -- # setup reset 00:03:41.163 04:47:16 -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:41.163 04:47:16 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:03:45.365 00:03:45.365 real 0m8.614s 00:03:45.365 user 0m2.313s 00:03:45.365 sys 0m4.774s 00:03:45.365 04:47:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:45.365 04:47:19 -- common/autotest_common.sh@10 -- # set +x 00:03:45.365 ************************************ 00:03:45.365 END TEST allowed 00:03:45.365 ************************************ 00:03:45.365 00:03:45.365 real 0m24.044s 00:03:45.365 user 0m7.512s 00:03:45.365 sys 0m14.650s 00:03:45.365 04:47:19 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:45.365 04:47:19 -- common/autotest_common.sh@10 -- # set +x 00:03:45.365 ************************************ 00:03:45.365 END TEST acl 00:03:45.365 ************************************ 00:03:45.365 04:47:19 -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:45.365 04:47:19 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:45.365 04:47:19 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:45.365 04:47:19 -- common/autotest_common.sh@10 -- # set +x 00:03:45.365 ************************************ 00:03:45.365 START TEST hugepages 00:03:45.365 ************************************ 00:03:45.365 04:47:19 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/hugepages.sh 00:03:45.365 * Looking for test storage... 
00:03:45.365 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup 00:03:45.365 04:47:19 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:03:45.365 04:47:19 -- common/autotest_common.sh@1690 -- # lcov --version 00:03:45.365 04:47:19 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:03:45.365 04:47:19 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:03:45.365 04:47:19 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:03:45.365 04:47:19 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:03:45.365 04:47:19 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:03:45.365 04:47:19 -- scripts/common.sh@335 -- # IFS=.-: 00:03:45.365 04:47:19 -- scripts/common.sh@335 -- # read -ra ver1 00:03:45.365 04:47:19 -- scripts/common.sh@336 -- # IFS=.-: 00:03:45.365 04:47:19 -- scripts/common.sh@336 -- # read -ra ver2 00:03:45.365 04:47:19 -- scripts/common.sh@337 -- # local 'op=<' 00:03:45.365 04:47:19 -- scripts/common.sh@339 -- # ver1_l=2 00:03:45.365 04:47:19 -- scripts/common.sh@340 -- # ver2_l=1 00:03:45.365 04:47:19 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:03:45.365 04:47:19 -- scripts/common.sh@343 -- # case "$op" in 00:03:45.365 04:47:19 -- scripts/common.sh@344 -- # : 1 00:03:45.365 04:47:19 -- scripts/common.sh@363 -- # (( v = 0 )) 00:03:45.365 04:47:19 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:03:45.365 04:47:19 -- scripts/common.sh@364 -- # decimal 1 00:03:45.365 04:47:19 -- scripts/common.sh@352 -- # local d=1 00:03:45.365 04:47:19 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:03:45.365 04:47:19 -- scripts/common.sh@354 -- # echo 1 00:03:45.365 04:47:19 -- scripts/common.sh@364 -- # ver1[v]=1 00:03:45.365 04:47:19 -- scripts/common.sh@365 -- # decimal 2 00:03:45.365 04:47:19 -- scripts/common.sh@352 -- # local d=2 00:03:45.365 04:47:19 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:03:45.365 04:47:19 -- scripts/common.sh@354 -- # echo 2 00:03:45.365 04:47:19 -- scripts/common.sh@365 -- # ver2[v]=2 00:03:45.365 04:47:19 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:03:45.365 04:47:19 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:03:45.365 04:47:19 -- scripts/common.sh@367 -- # return 0 00:03:45.365 04:47:19 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:03:45.365 04:47:19 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:03:45.365 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:45.365 --rc genhtml_branch_coverage=1 00:03:45.365 --rc genhtml_function_coverage=1 00:03:45.365 --rc genhtml_legend=1 00:03:45.365 --rc geninfo_all_blocks=1 00:03:45.365 --rc geninfo_unexecuted_blocks=1 00:03:45.365 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:45.365 ' 00:03:45.365 04:47:19 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:03:45.365 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:45.365 --rc genhtml_branch_coverage=1 00:03:45.365 --rc genhtml_function_coverage=1 00:03:45.365 --rc genhtml_legend=1 00:03:45.365 --rc geninfo_all_blocks=1 00:03:45.365 --rc geninfo_unexecuted_blocks=1 00:03:45.365 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:45.365 ' 00:03:45.365 04:47:19 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:03:45.365 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:45.365 --rc genhtml_branch_coverage=1 
00:03:45.365 --rc genhtml_function_coverage=1 00:03:45.365 --rc genhtml_legend=1 00:03:45.365 --rc geninfo_all_blocks=1 00:03:45.365 --rc geninfo_unexecuted_blocks=1 00:03:45.365 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:45.365 ' 00:03:45.365 04:47:19 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:03:45.365 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:03:45.365 --rc genhtml_branch_coverage=1 00:03:45.365 --rc genhtml_function_coverage=1 00:03:45.365 --rc genhtml_legend=1 00:03:45.365 --rc geninfo_all_blocks=1 00:03:45.365 --rc geninfo_unexecuted_blocks=1 00:03:45.365 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:03:45.365 ' 00:03:45.365 04:47:19 -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:45.365 04:47:19 -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:45.365 04:47:19 -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:45.365 04:47:19 -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:45.365 04:47:19 -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:45.365 04:47:19 -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:45.365 04:47:19 -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:45.365 04:47:19 -- setup/common.sh@18 -- # local node= 00:03:45.365 04:47:19 -- setup/common.sh@19 -- # local var val 00:03:45.365 04:47:19 -- setup/common.sh@20 -- # local mem_f mem 00:03:45.365 04:47:19 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:45.365 04:47:19 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:45.365 04:47:19 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:45.365 04:47:19 -- setup/common.sh@28 -- # mapfile -t mem 00:03:45.365 04:47:19 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:45.365 04:47:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.365 04:47:19 -- setup/common.sh@31 -- # read -r var val _ 00:03:45.366 04:47:19 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 42030268 kB' 'MemAvailable: 43778912 kB' 'Buffers: 7364 kB' 'Cached: 9467688 kB' 'SwapCached: 32 kB' 'Active: 8527600 kB' 'Inactive: 1466156 kB' 'Active(anon): 8031516 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522004 kB' 'Mapped: 188896 kB' 'Shmem: 7530568 kB' 'KReclaimable: 476372 kB' 'Slab: 1316564 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 840192 kB' 'KernelStack: 21840 kB' 'PageTables: 8140 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36433336 kB' 'Committed_AS: 9219732 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214224 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:03:45.366 04:47:19 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:45.366 04:47:19 -- setup/common.sh@32 -- # continue 00:03:45.366 04:47:19 -- setup/common.sh@31 -- # IFS=': ' 00:03:45.366 04:47:19 -- setup/common.sh@31 
-- # read -r var val _
[xtrace condensed: the same compare-and-continue pair repeats for every remaining /proc/meminfo field, MemFree through HugePages_Surp, none of which matches Hugepagesize]
00:03:45.367 04:47:20 -- 
setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:45.367 04:47:20 -- setup/common.sh@33 -- # echo 2048 00:03:45.367 04:47:20 -- setup/common.sh@33 -- # return 0 00:03:45.367 04:47:20 -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:03:45.367 04:47:20 -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:45.367 04:47:20 -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:45.367 04:47:20 -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:45.367 04:47:20 -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:45.367 04:47:20 -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:45.367 04:47:20 -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:45.367 04:47:20 -- setup/hugepages.sh@207 -- # get_nodes 00:03:45.367 04:47:20 -- setup/hugepages.sh@27 -- # local node 00:03:45.367 04:47:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:45.367 04:47:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=2048 00:03:45.367 04:47:20 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:45.367 04:47:20 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:45.367 04:47:20 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:45.367 04:47:20 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:45.367 04:47:20 -- setup/hugepages.sh@208 -- # clear_hp 00:03:45.367 04:47:20 -- setup/hugepages.sh@37 -- # local node hp 00:03:45.367 04:47:20 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:45.367 04:47:20 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:45.367 04:47:20 -- setup/hugepages.sh@41 -- # echo 0 00:03:45.367 04:47:20 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:45.367 04:47:20 -- setup/hugepages.sh@41 -- # echo 0 00:03:45.367 04:47:20 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:45.367 04:47:20 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:45.367 04:47:20 -- setup/hugepages.sh@41 -- # echo 0 00:03:45.367 04:47:20 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:45.367 04:47:20 -- setup/hugepages.sh@41 -- # echo 0 00:03:45.367 04:47:20 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:45.367 04:47:20 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:45.367 04:47:20 -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:45.367 04:47:20 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:45.367 04:47:20 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:45.367 04:47:20 -- common/autotest_common.sh@10 -- # set +x 00:03:45.367 ************************************ 00:03:45.367 START TEST default_setup 00:03:45.367 ************************************ 00:03:45.367 04:47:20 -- common/autotest_common.sh@1114 -- # default_setup 00:03:45.367 04:47:20 -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:45.367 04:47:20 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:45.368 04:47:20 -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:45.368 04:47:20 -- setup/hugepages.sh@51 -- # shift 00:03:45.368 04:47:20 -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:45.368 04:47:20 -- setup/hugepages.sh@52 -- # local node_ids 00:03:45.368 04:47:20 -- setup/hugepages.sh@55 -- # (( size >= 
default_hugepages )) 00:03:45.368 04:47:20 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:45.368 04:47:20 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:45.368 04:47:20 -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:45.368 04:47:20 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:45.368 04:47:20 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:45.368 04:47:20 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:45.368 04:47:20 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:45.368 04:47:20 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:45.368 04:47:20 -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:45.368 04:47:20 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:45.368 04:47:20 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:45.368 04:47:20 -- setup/hugepages.sh@73 -- # return 0 00:03:45.368 04:47:20 -- setup/hugepages.sh@137 -- # setup output 00:03:45.368 04:47:20 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:45.368 04:47:20 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:48.657 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:48.657 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:50.567 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:03:50.567 04:47:25 -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:50.567 04:47:25 -- setup/hugepages.sh@89 -- # local node 00:03:50.567 04:47:25 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:50.567 04:47:25 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:50.567 04:47:25 -- setup/hugepages.sh@92 -- # local surp 00:03:50.567 04:47:25 -- setup/hugepages.sh@93 -- # local resv 00:03:50.567 04:47:25 -- setup/hugepages.sh@94 -- # local anon 00:03:50.567 04:47:25 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:50.567 04:47:25 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:50.567 04:47:25 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:50.567 04:47:25 -- setup/common.sh@18 -- # local node= 00:03:50.567 04:47:25 -- setup/common.sh@19 -- # local var val 00:03:50.567 04:47:25 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.567 04:47:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.567 04:47:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.567 04:47:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.567 04:47:25 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.567 04:47:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.567 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.567 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 
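The trace above has just entered `get_meminfo AnonHugePages`: the helper mapfiles the whole meminfo source (a per-node `node<N>/meminfo` when `node=` is set, otherwise `/proc/meminfo`), strips any `Node N ` prefix, then splits each line on `': '` and compares the key until the requested field matches. The dump printed next is then scanned field by field. A condensed, hypothetical sketch of that logic, assuming system-wide /proc/meminfo only (per-node handling omitted):

    # Condensed form of the get_meminfo helper being traced here:
    # scan /proc/meminfo and print the value (in kB) of one field.
    get_meminfo() {
      local get=$1 line var val _
      local -a mem
      mapfile -t mem < /proc/meminfo
      for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
      done
      return 1
    }
    # e.g. get_meminfo Hugepagesize   # -> 2048 on this machine, as traced earlier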
00:03:50.567 04:47:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44230440 kB' 'MemAvailable: 45979084 kB' 'Buffers: 7364 kB' 'Cached: 9467820 kB' 'SwapCached: 32 kB' 'Active: 8528700 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032616 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523016 kB' 'Mapped: 188908 kB' 'Shmem: 7530700 kB' 'KReclaimable: 476372 kB' 'Slab: 1314492 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 838120 kB' 'KernelStack: 22016 kB' 'PageTables: 8564 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9223068 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214176 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:03:50.567 04:47:25 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.567 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.567 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.567 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.567 04:47:25 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.567 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.567 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.567 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.567 04:47:25 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.567 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.567 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.567 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.567 04:47:25 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.567 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 
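The field-by-field scan that resumes below walks the whole dump again just to pull out AnonHugePages. The harness does this in pure bash; outside the harness the same lookup is a one-liner (equivalent in effect, not what the scripts run):

    # One-shot lookup of a single meminfo field; prints the value in kB
    # (AnonHugePages is 0 kB in the dump above).
    awk '$1 == "AnonHugePages:" {print $2}' /proc/meminfo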
00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 
00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # 
continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # continue 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.568 04:47:25 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:50.568 04:47:25 -- setup/common.sh@33 -- # echo 0 00:03:50.568 04:47:25 -- setup/common.sh@33 -- # return 0 00:03:50.568 04:47:25 -- setup/hugepages.sh@97 -- # anon=0 00:03:50.568 04:47:25 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:50.568 04:47:25 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:50.568 04:47:25 -- setup/common.sh@18 -- # local node= 00:03:50.568 04:47:25 -- setup/common.sh@19 -- # local var val 00:03:50.568 04:47:25 -- setup/common.sh@20 -- # local mem_f mem 00:03:50.568 04:47:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.568 04:47:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:50.568 04:47:25 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:50.568 04:47:25 -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.568 04:47:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': ' 00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _ 00:03:50.569 04:47:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44231744 kB' 'MemAvailable: 45980388 kB' 'Buffers: 7364 kB' 'Cached: 9467828 kB' 'SwapCached: 32 kB' 'Active: 8528308 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032224 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522624 kB' 'Mapped: 188904 kB' 'Shmem: 7530708 kB' 'KReclaimable: 476372 kB' 'Slab: 1314524 kB' 'SReclaimable: 
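get_meminfo, whose trace this is (common.sh@16-@33), snapshots the relevant meminfo file and scans it field by field under IFS=': ' until the requested field matches; AnonHugePages just resolved to anon=0, and the calls that follow read HugePages_Surp and HugePages_Rsvd the same way. A condensed sketch of the helper, assuming the same interface (the in-tree version mapfiles the snapshot first and strips the "Node N " prefix, as the trace shows):

    get_meminfo() {
        local get=$1 node=${2:-}    # field name, optional NUMA node
        local mem_f=/proc/meminfo
        # a per-node query switches to that node's own meminfo file
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        local var val _
        # node files prefix each line with "Node N "; drop it, then split
        # "Field: value kB" on ': ' and print the matching value
        while IFS=': ' read -r var val _; do
            [[ $var == "$get" ]] && { echo "${val:-0}"; return 0; }
        done < <(sed -E 's/^Node [0-9]+ //' "$mem_f")
        echo 0                      # field absent
    }

For example, get_meminfo HugePages_Free would print 1024 against the snapshot above, and get_meminfo HugePages_Surp 0 (used later in this trace) reads node 0's own meminfo.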
00:03:50.568 04:47:25 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:50.568 04:47:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:50.568 04:47:25 -- setup/common.sh@18 -- # local node=
00:03:50.568 04:47:25 -- setup/common.sh@19 -- # local var val
00:03:50.568 04:47:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:50.568 04:47:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.568 04:47:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:50.568 04:47:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:50.568 04:47:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.568 04:47:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.568 04:47:25 -- setup/common.sh@31 -- # IFS=': '
00:03:50.568 04:47:25 -- setup/common.sh@31 -- # read -r var val _
00:03:50.569 04:47:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44231744 kB' 'MemAvailable: 45980388 kB' 'Buffers: 7364 kB' 'Cached: 9467828 kB' 'SwapCached: 32 kB' 'Active: 8528308 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032224 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522624 kB' 'Mapped: 188904 kB' 'Shmem: 7530708 kB' 'KReclaimable: 476372 kB' 'Slab: 1314524 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 838152 kB' 'KernelStack: 21904 kB' 'PageTables: 8080 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9223216 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214208 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:03:50.569 04:47:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:50.569 04:47:25 -- setup/common.sh@32 -- # continue
[... per-field scan elided: each remaining meminfo field fails the match and logs "continue", "IFS=': '", "read -r var val _" in the same pattern ...]
00:03:50.570 04:47:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:50.570 04:47:25 -- setup/common.sh@33 -- # echo 0
00:03:50.570 04:47:25 -- setup/common.sh@33 -- # return 0
00:03:50.570 04:47:25 -- setup/hugepages.sh@99 -- # surp=0
00:03:50.570 04:47:25 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:50.570 04:47:25 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:50.570 04:47:25 -- setup/common.sh@18 -- # local node=
00:03:50.570 04:47:25 -- setup/common.sh@19 -- # local var val
00:03:50.570 04:47:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:50.570 04:47:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.570 04:47:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:50.570 04:47:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:50.570 04:47:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.570 04:47:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.570 04:47:25 -- setup/common.sh@31 -- # IFS=': '
00:03:50.570 04:47:25 -- setup/common.sh@31 -- # read -r var val _
00:03:50.570 04:47:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44232044 kB' 'MemAvailable: 45980688 kB' 'Buffers: 7364 kB' 'Cached: 9467844 kB' 'SwapCached: 32 kB' 'Active: 8528216 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032132 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522504 kB' 'Mapped: 188956 kB' 'Shmem: 7530724 kB' 'KReclaimable: 476372 kB' 'Slab: 1314560 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 838188 kB' 'KernelStack: 21968 kB' 'PageTables: 8232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9223232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214256 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:03:50.570 04:47:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:50.570 04:47:25 -- setup/common.sh@32 -- # continue
[... per-field scan elided: each remaining meminfo field fails the match and logs "continue", "IFS=': '", "read -r var val _" in the same pattern ...]
00:03:50.571 04:47:25 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:50.571 04:47:25 -- setup/common.sh@33 -- # echo 0
00:03:50.571 04:47:25 -- setup/common.sh@33 -- # return 0
00:03:50.571 04:47:25 -- setup/hugepages.sh@100 -- # resv=0
00:03:50.571 04:47:25 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:50.571 nr_hugepages=1024
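At this point the three ancillary counters are in: anon=0, surp=0, resv=0, and nr_hugepages=1024 has been echoed. The verify_nr_hugepages checks that follow (hugepages.sh@107-@110) appear to assert that the kernel's view agrees, i.e. HugePages_Total == nr_hugepages + surp + resv (1024 == 1024 + 0 + 0 here). Roughly, assuming the get_meminfo sketch above:

    nr_hugepages=1024 surp=0 resv=0           # values from the trace
    total=$(get_meminfo HugePages_Total)      # 1024 on this run
    if (( total == nr_hugepages + surp + resv )); then
        echo "hugepage accounting consistent: $total pages"
    else
        echo "hugepage mismatch: kernel reports $total" >&2
        exit 1
    fi

The same identity is then re-checked per NUMA node, via get_nodes and get_meminfo HugePages_Surp 0.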
00:03:50.571 04:47:25 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:50.571 resv_hugepages=0
00:03:50.571 04:47:25 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:50.571 surplus_hugepages=0
00:03:50.571 04:47:25 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:50.571 anon_hugepages=0
00:03:50.571 04:47:25 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:50.571 04:47:25 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:03:50.571 04:47:25 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:50.571 04:47:25 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:50.571 04:47:25 -- setup/common.sh@18 -- # local node=
00:03:50.571 04:47:25 -- setup/common.sh@19 -- # local var val
00:03:50.571 04:47:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:50.571 04:47:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.571 04:47:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:50.571 04:47:25 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:50.571 04:47:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.571 04:47:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.571 04:47:25 -- setup/common.sh@31 -- # IFS=': '
00:03:50.571 04:47:25 -- setup/common.sh@31 -- # read -r var val _
00:03:50.571 04:47:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44230672 kB' 'MemAvailable: 45979316 kB' 'Buffers: 7364 kB' 'Cached: 9467864 kB' 'SwapCached: 32 kB' 'Active: 8528772 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032688 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523056 kB' 'Mapped: 188956 kB' 'Shmem: 7530744 kB' 'KReclaimable: 476372 kB' 'Slab: 1314552 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 838180 kB' 'KernelStack: 22000 kB' 'PageTables: 8644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9222104 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:03:50.571 04:47:25 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:50.571 04:47:25 -- setup/common.sh@32 -- # continue
[... per-field scan elided: each remaining meminfo field fails the match and logs "continue", "IFS=': '", "read -r var val _" in the same pattern ...]
00:03:50.573 04:47:25 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:50.573 04:47:25 -- setup/common.sh@33 -- # echo 1024
00:03:50.573 04:47:25 -- setup/common.sh@33 -- # return 0
00:03:50.573 04:47:25 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:50.573 04:47:25 -- setup/hugepages.sh@112 -- # get_nodes
00:03:50.573 04:47:25 -- setup/hugepages.sh@27 -- # local node
00:03:50.573 04:47:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:50.573 04:47:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:03:50.573 04:47:25 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:50.573 04:47:25 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:03:50.573 04:47:25 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:50.573 04:47:25 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:03:50.573 04:47:25 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:50.573 04:47:25 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:50.573 04:47:25 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:03:50.573 04:47:25 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:50.573 04:47:25 -- setup/common.sh@18 -- # local node=0
00:03:50.573 04:47:25 -- setup/common.sh@19 -- # local var val
00:03:50.573 04:47:25 -- setup/common.sh@20 -- # local mem_f mem
00:03:50.573 04:47:25 -- 
00:03:50.573 04:47:25 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:50.573 04:47:25 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:03:50.573 04:47:25 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:03:50.573 04:47:25 -- setup/common.sh@28 -- # mapfile -t mem
00:03:50.573 04:47:25 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:50.573 04:47:25 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 24386136 kB' 'MemUsed: 8199232 kB' 'SwapCached: 32 kB' 'Active: 3636440 kB' 'Inactive: 269840 kB' 'Active(anon): 3255436 kB' 'Inactive(anon): 60 kB' 'Active(file): 381004 kB' 'Inactive(file): 269780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3499360 kB' 'Mapped: 145184 kB' 'AnonPages: 410172 kB' 'Shmem: 2848544 kB' 'KernelStack: 12584 kB' 'PageTables: 5404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 251412 kB' 'Slab: 650384 kB' 'SReclaimable: 251412 kB' 'SUnreclaim: 398972 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[setup/common.sh@31-32: read/compare/continue xtrace repeats for each node0 meminfo field]
00:03:50.574 04:47:25 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:50.574 04:47:25 -- setup/common.sh@33 -- # echo 0
00:03:50.574 04:47:25 -- setup/common.sh@33 -- # return 0
00:03:50.574 04:47:25 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:50.574 04:47:25 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:50.574 04:47:25 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:50.574 04:47:25 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:50.574 04:47:25 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:03:50.574 node0=1024 expecting 1024
00:03:50.574 04:47:25 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:03:50.574 
00:03:50.574 real	0m5.397s
00:03:50.574 user	0m1.453s
00:03:50.574 sys	0m2.449s
00:03:50.574 04:47:25 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:50.574 04:47:25 -- common/autotest_common.sh@10 -- # set +x
00:03:50.574 ************************************
00:03:50.574 END TEST default_setup
00:03:50.574 ************************************
00:03:50.574 04:47:25 -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc
00:03:50.574 04:47:25 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:50.574 04:47:25 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:50.574 04:47:25 -- common/autotest_common.sh@10 -- # set +x
00:03:50.574 ************************************
00:03:50.574 START TEST per_node_1G_alloc
00:03:50.574 ************************************
00:03:50.574 04:47:25 -- common/autotest_common.sh@1114 -- # per_node_1G_alloc
00:03:50.574 04:47:25 -- setup/hugepages.sh@143 -- # local IFS=,
00:03:50.574 04:47:25 -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1
00:03:50.574 04:47:25 -- setup/hugepages.sh@49 -- # local size=1048576
00:03:50.574 04:47:25 -- setup/hugepages.sh@50 -- # (( 3 > 1 ))
00:03:50.574 04:47:25 -- setup/hugepages.sh@51 -- # shift
00:03:50.574 04:47:25 -- setup/hugepages.sh@52 -- # node_ids=('0' '1')
00:03:50.574 04:47:25 -- setup/hugepages.sh@52 -- # local node_ids
00:03:50.574 04:47:25 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:50.574 04:47:25 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:03:50.574 04:47:25 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1
00:03:50.574 04:47:25 -- setup/hugepages.sh@62 -- # user_nodes=('0' '1')
00:03:50.574 04:47:25 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:50.574 04:47:25 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:03:50.574 04:47:25 -- setup/hugepages.sh@65 -- # local _no_nodes=2
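The parameters traced above size the new test: get_test_nr_hugepages is asked for 1048576 kB (1 GiB) spread over nodes 0 and 1, which with the default 2048 kB hugepage size comes to 512 pages per node, and 1024 in total for the check at hugepages.sh@147 later. A small sketch of that arithmetic follows; the explicit division is our reconstruction of how nr_hugepages=512 is derived, and the loop mirrors the nodes_test assignments traced next.

    # Sizing math behind "get_test_nr_hugepages 1048576 0 1" (sketch).
    default_hugepages=2048             # kB per huge page, as in /proc/meminfo
    size=1048576                       # kB requested, i.e. 1 GiB
    node_ids=(0 1)
    nr_hugepages=$(( size / default_hugepages ))   # 512 pages per node
    declare -A nodes_test
    for _no_nodes in "${node_ids[@]}"; do          # loop variable name as traced
      nodes_test[$_no_nodes]=$nr_hugepages
    done
    IFS=,
    echo "NRHUGE=$nr_hugepages HUGENODE=${node_ids[*]}"   # NRHUGE=512 HUGENODE=0,1

The comma join works because per_node_1G_alloc sets "local IFS=," first, which is why the trace prints HUGENODE=0,1 rather than HUGENODE=0 1.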
00:03:50.574 04:47:25 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:50.574 04:47:25 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:50.574 04:47:25 -- setup/hugepages.sh@69 -- # (( 2 > 0 ))
00:03:50.574 04:47:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:50.574 04:47:25 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:50.574 04:47:25 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:03:50.574 04:47:25 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512
00:03:50.574 04:47:25 -- setup/hugepages.sh@73 -- # return 0
00:03:50.574 04:47:25 -- setup/hugepages.sh@146 -- # NRHUGE=512
00:03:50.574 04:47:25 -- setup/hugepages.sh@146 -- # HUGENODE=0,1
00:03:50.574 04:47:25 -- setup/hugepages.sh@146 -- # setup output
00:03:50.574 04:47:25 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:50.574 04:47:25 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:03:53.949 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:03:53.949 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:03:53.949 04:47:28 -- setup/hugepages.sh@147 -- # nr_hugepages=1024
00:03:53.949 04:47:28 -- setup/hugepages.sh@147 -- # verify_nr_hugepages
00:03:53.949 04:47:28 -- setup/hugepages.sh@89 -- # local node
00:03:53.950 04:47:28 -- setup/hugepages.sh@90 -- # local sorted_t
00:03:53.950 04:47:28 -- setup/hugepages.sh@91 -- # local sorted_s
00:03:53.950 04:47:28 -- setup/hugepages.sh@92 -- # local surp
00:03:53.950 04:47:28 -- setup/hugepages.sh@93 -- # local resv
00:03:53.950 04:47:28 -- setup/hugepages.sh@94 -- # local anon
00:03:53.950 04:47:28 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:03:53.950 04:47:28 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
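Before counting AnonHugePages, verify_nr_hugepages gates on transparent hugepages at hugepages.sh@96: the bracketed token in /sys/kernel/mm/transparent_hugepage/enabled must not be [never], which is what the escaped *\[\n\e\v\e\r\]* pattern in the trace tests. A sketch of that gate, reusing the get_meminfo sketch above; thp_active is our name, not the script's.

    # Gate sketched from hugepages.sh@96: only count THP-backed anonymous
    # memory when THP is not globally disabled.
    thp_active() {
      local state
      state=$(</sys/kernel/mm/transparent_hugepage/enabled)  # e.g. "always [madvise] never"
      [[ $state != *"[never]"* ]]
    }

    anon=0
    if thp_active; then
      anon=$(get_meminfo AnonHugePages)   # 0 kB in the dump below
    fi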
00:03:53.950 04:47:28 -- setup/common.sh@17 -- # local get=AnonHugePages
00:03:53.950 04:47:28 -- setup/common.sh@18 -- # local node=
00:03:53.950 04:47:28 -- setup/common.sh@19 -- # local var val
00:03:53.950 04:47:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:53.950 04:47:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:53.950 04:47:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:53.950 04:47:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:53.950 04:47:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:53.950 04:47:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:53.950 04:47:28 -- setup/common.sh@31 -- # IFS=': '
00:03:53.950 04:47:28 -- setup/common.sh@31 -- # read -r var val _
00:03:53.950 04:47:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44248564 kB' 'MemAvailable: 45997208 kB' 'Buffers: 7364 kB' 'Cached: 9467936 kB' 'SwapCached: 32 kB' 'Active: 8529400 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033316 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523456 kB' 'Mapped: 187936 kB' 'Shmem: 7530816 kB' 'KReclaimable: 476372 kB' 'Slab: 1314212 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 837840 kB' 'KernelStack: 21920 kB' 'PageTables: 8316 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9218220 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214624 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
[setup/common.sh@31-32: read/compare/continue xtrace repeats for each field]
00:03:53.950 04:47:28 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:53.950 04:47:28 -- setup/common.sh@33 -- # echo 0
00:03:53.950 04:47:28 -- setup/common.sh@33 -- # return 0
00:03:53.950 04:47:28 -- setup/hugepages.sh@97 -- # anon=0
00:03:53.950 04:47:28 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:53.951 04:47:28 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:53.951 04:47:28 -- setup/common.sh@18 -- # local node=
00:03:53.951 04:47:28 -- setup/common.sh@19 -- # local var val
00:03:53.951 04:47:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:53.951 04:47:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:53.951 04:47:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:53.951 04:47:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:53.951 04:47:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:53.951 04:47:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:53.951 04:47:28 -- setup/common.sh@31 -- # IFS=': '
00:03:53.951 04:47:28 -- setup/common.sh@31 -- # read -r var val _
00:03:53.951 04:47:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44248036 kB' 'MemAvailable: 45996680 kB' 'Buffers: 7364 kB' 'Cached: 9467940 kB' 'SwapCached: 32 kB' 'Active: 8529004 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032920 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523136 kB' 'Mapped: 187924 kB' 'Shmem: 7530820 kB' 'KReclaimable: 476372 kB' 'Slab: 1314236 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 837864 kB' 'KernelStack: 22064 kB' 'PageTables: 8652 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9218232 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214544 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
[setup/common.sh@31-32: read/compare/continue xtrace repeats for each field]
00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:53.952 04:47:28 -- setup/common.sh@33 -- # echo 0
00:03:53.952 04:47:28 -- setup/common.sh@33 -- # return 0
00:03:53.952 04:47:28 -- setup/hugepages.sh@99 -- # surp=0
00:03:53.952 04:47:28 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:53.952 04:47:28 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:53.952 04:47:28 -- setup/common.sh@18 -- # local node=
00:03:53.952 04:47:28 -- setup/common.sh@19 -- # local var val
00:03:53.952 04:47:28 -- setup/common.sh@20 -- # local mem_f mem
00:03:53.952 04:47:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:53.952 04:47:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:53.952 04:47:28 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:53.952 04:47:28 -- setup/common.sh@28 -- # mapfile -t mem
00:03:53.952 04:47:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': '
00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _
00:03:53.952 04:47:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44247196 kB' 'MemAvailable: 45995840 kB' 'Buffers: 7364 kB' 'Cached: 9467952 kB' 'SwapCached: 32 kB' 'Active: 8528584 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032500 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522692 kB' 'Mapped: 187924 kB' 'Shmem: 7530832 kB' 'KReclaimable: 476372 kB' 'Slab: 1314236 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 837864 kB' 'KernelStack: 21936 kB' 'PageTables: 8100 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9218248 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214560 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
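The dumps in this pass are internally consistent: the Hugetlb line is simply HugePages_Total times Hugepagesize. A quick check using the helper sketched earlier:

    # Consistency of the dump just printed: 1024 pages * 2048 kB per page.
    pages=$(get_meminfo HugePages_Total)   # 1024
    pgsz=$(get_meminfo Hugepagesize)       # 2048 (kB)
    echo $(( pages * pgsz ))               # 2097152, matching 'Hugetlb: 2097152 kB'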
setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.952 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.952 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.953 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.953 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.953 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.953 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.953 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.953 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.953 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.953 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.953 04:47:28 -- 
setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.953 04:47:28 -- setup/common.sh@32 -- # continue
[repeated xtrace entries condensed: the get_meminfo read loop emitted one "[[ key == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]" / "continue" pair for each remaining non-matching /proc/meminfo key (SwapTotal through HugePages_Free); the keys and their values all appear in the preceding printf]
00:03:53.953 04:47:28 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.954 04:47:28 -- setup/common.sh@33 -- # echo 0 00:03:53.954 04:47:28 -- setup/common.sh@33 -- # return 0 00:03:53.954 04:47:28 -- setup/hugepages.sh@100 -- # resv=0
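The entries above trace a single get_meminfo lookup: the loop reads one 'Key: value' pair per line and skips everything that is not the requested key (HugePages_Rsvd, value 0 here). A minimal standalone sketch of that pattern, assuming a meminfo-style input file; the function name is illustrative, not the exact setup/common.sh helper:

    #!/usr/bin/env bash
    # Sketch of the traced lookup pattern (illustrative, not the exact
    # setup/common.sh code): scan a meminfo-style file for one key and
    # print its value.
    get_meminfo_sketch() {
        local get=$1 mem_f=${2:-/proc/meminfo}
        local var val _
        while IFS=': ' read -r var val _; do
            # Each non-matching key corresponds to one "continue" entry above.
            [[ $var == "$get" ]] || continue
            echo "$val"   # e.g. "0" for HugePages_Rsvd in this run
            return 0
        done < "$mem_f"
        return 1
    }

    # Example: resv=$(get_meminfo_sketch HugePages_Rsvd)   # -> 0 here

The real helper additionally buffers the file with mapfile and supports per-node files, as the later lookups in this log show.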
00:03:53.954 04:47:28 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:53.954 nr_hugepages=1024 00:03:53.954 04:47:28 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:53.954 resv_hugepages=0 00:03:53.954 04:47:28 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:53.954 surplus_hugepages=0 00:03:53.954 04:47:28 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:53.954 anon_hugepages=0 00:03:53.954 04:47:28 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:53.954 04:47:28 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:53.954 04:47:28 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:53.954 04:47:28 -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:53.954 04:47:28 -- setup/common.sh@18 -- # local node= 00:03:53.954 04:47:28 -- setup/common.sh@19 -- # local var val 00:03:53.954 04:47:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:53.954 04:47:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:53.954 04:47:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:53.954 04:47:28 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:53.954 04:47:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:53.954 04:47:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:53.954 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.954 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.954 04:47:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44246012 kB' 'MemAvailable: 45994656 kB' 'Buffers: 7364 kB' 'Cached: 9467964 kB' 'SwapCached: 32 kB' 'Active: 8528868 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032784 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522944 kB' 'Mapped: 187924 kB' 'Shmem: 7530844 kB' 'KReclaimable: 476372 kB' 'Slab: 1314236 kB' 'SReclaimable: 476372 kB' 'SUnreclaim: 837864 kB' 'KernelStack: 21888 kB' 'PageTables: 8272 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9218260 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214528 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:03:53.954 04:47:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.954 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.954 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.954 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.954 04:47:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.954 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.954 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.954 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.954 04:47:28 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.954 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.954 04:47:28 -- setup/common.sh@31 -- # IFS=': 
' 00:03:53.954 04:47:28 -- setup/common.sh@31 -- # read -r var val _
[repeated xtrace entries condensed: the get_meminfo read loop "continue"d past each remaining non-matching /proc/meminfo key (Buffers through Unaccepted) while searching for HugePages_Total]
00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.955 04:47:28 -- setup/common.sh@33 -- # echo 1024 00:03:53.955 04:47:28 -- setup/common.sh@33 -- # return 0 00:03:53.955 04:47:28 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:53.955 04:47:28 -- setup/hugepages.sh@112 -- # get_nodes 00:03:53.955 04:47:28 -- setup/hugepages.sh@27 -- # local node 00:03:53.955 04:47:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:53.955 04:47:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:53.955 04:47:28 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:53.955 04:47:28 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:53.955 04:47:28 -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:53.955 04:47:28 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:53.955 04:47:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:53.955 04:47:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:53.955 04:47:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:53.955 04:47:28 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:53.955 04:47:28 -- setup/common.sh@18 -- # local node=0 00:03:53.955 04:47:28 -- setup/common.sh@19 -- #
local var val 00:03:53.955 04:47:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:53.955 04:47:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:53.955 04:47:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:53.955 04:47:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:53.955 04:47:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:53.955 04:47:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.955 04:47:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25445376 kB' 'MemUsed: 7139992 kB' 'SwapCached: 32 kB' 'Active: 3637728 kB' 'Inactive: 269840 kB' 'Active(anon): 3256724 kB' 'Inactive(anon): 60 kB' 'Active(file): 381004 kB' 'Inactive(file): 269780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3499476 kB' 'Mapped: 144220 kB' 'AnonPages: 411268 kB' 'Shmem: 2848660 kB' 'KernelStack: 12568 kB' 'PageTables: 5404 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 251412 kB' 'Slab: 650156 kB' 'SReclaimable: 251412 kB' 'SUnreclaim: 398744 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # continue 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.955 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.955 04:47:28 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.955 
04:47:28 -- setup/common.sh@32 -- # continue
[repeated xtrace entries condensed: the node0 get_meminfo read loop "continue"d past each non-matching key (Active(file) through HugePages_Free) while searching for HugePages_Surp]
00:03:53.956 04:47:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.956 04:47:28 -- setup/common.sh@33 -- # echo 0 00:03:53.956 04:47:28 -- setup/common.sh@33 -- # return 0 00:03:53.956 04:47:28 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:53.956 04:47:28 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:53.956 04:47:28 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:53.956 04:47:28 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:53.956 04:47:28 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:53.956 04:47:28 -- setup/common.sh@18 -- # local node=1 00:03:53.956 04:47:28 -- setup/common.sh@19 -- # local var val 00:03:53.956 04:47:28 -- setup/common.sh@20 -- # local mem_f mem 00:03:53.956 04:47:28 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:53.956 04:47:28 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:53.956 04:47:28 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:53.956 04:47:28 -- setup/common.sh@28 -- # mapfile -t mem 00:03:53.956 04:47:28 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:53.956 04:47:28 -- setup/common.sh@31 -- # IFS=': ' 00:03:53.956 04:47:28 -- setup/common.sh@31 -- # read -r var val _ 00:03:53.956 04:47:28 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698400 kB' 'MemFree: 18802032 kB' 'MemUsed: 8896368 kB' 'SwapCached: 0 kB' 'Active: 4890876 kB' 'Inactive: 1196316 kB' 'Active(anon): 4775796 kB' 'Inactive(anon): 17696 kB' 'Active(file): 115080 kB' 'Inactive(file): 1178620 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5975900 kB' 'Mapped: 43704 kB' 'AnonPages: 111384 kB' 'Shmem: 4682200 kB' 'KernelStack: 9256 kB' 'PageTables: 2512 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224960 kB' 'Slab: 664080 kB' 'SReclaimable: 224960 kB' 'SUnreclaim: 439120 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
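Note the per-node variant visible above: the same lookup runs against /sys/devices/system/node/nodeN/meminfo, whose lines carry a "Node N " prefix that the traced code strips with mem=("${mem[@]#Node +([0-9]) }") before scanning. A hedged sketch of such a per-node read (helper name illustrative, not the exact setup/common.sh code):

    # Illustrative sketch: read one counter from a node-local meminfo file,
    # whose lines look like "Node 1 HugePages_Surp:      0".
    get_node_meminfo_sketch() {
        local get=$1 node=$2
        local mem_f=/sys/devices/system/node/node${node}/meminfo
        [[ -e $mem_f ]] || return 1
        local _node _n rest var val _
        while read -r _node _n rest; do            # _node="Node", _n="1"
            IFS=': ' read -r var val _ <<< "$rest" # split "Key: value [kB]"
            if [[ $var == "$get" ]]; then
                echo "$val"                        # e.g. 0 surplus pages
                return 0
            fi
        done < "$mem_f"
        return 1
    }

    # Example: surp=$(get_node_meminfo_sketch HugePages_Surp 1)   # -> 0 here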
00:03:53.956 04:47:28 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.956 04:47:28 -- setup/common.sh@32 -- # continue
[repeated xtrace entries condensed: the node1 get_meminfo read loop "continue"d past each non-matching key (MemFree through HugePages_Free) while searching for HugePages_Surp]
00:03:53.957 04:47:28 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.957 04:47:28 -- setup/common.sh@33 -- # echo 0 00:03:53.957 04:47:28 -- setup/common.sh@33 -- # return 0 00:03:53.957 04:47:28 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:53.957 04:47:28 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:53.957 04:47:28 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:53.957 04:47:28 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:53.957 04:47:28 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:53.957 node0=512 expecting 512 00:03:53.957 04:47:28 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:53.957 04:47:28 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:53.957 04:47:28 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
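The verification loop traced above, which finishes with the node1 line just below, folds the reserved and surplus counts (both 0 in this run) into each node's test count and then echoes it against the expected even split of the 1024 test hugepages. Roughly, under those assumptions (variable names follow the trace; this is a simplification, not the verbatim hugepages.sh logic):

    # Simplified accounting, assuming resv=0 and per-node surp=0 as traced.
    nr_hugepages=1024; no_nodes=2; resv=0
    nodes_test=([0]=512 [1]=512)   # per-node HugePages_Total readings
    for node in "${!nodes_test[@]}"; do
        surp=0                     # get_meminfo HugePages_Surp $node
        ((nodes_test[node] += surp + resv))
        echo "node${node}=${nodes_test[node]} expecting $((nr_hugepages / no_nodes))"
    done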
00:03:53.957 04:47:28 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:53.957 node1=512 expecting 512 00:03:53.957 04:47:28 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:53.957 00:03:53.957 real 0m3.246s 00:03:53.957 user 0m1.149s 00:03:53.957 sys 0m2.045s 00:03:53.957 04:47:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:03:53.957 04:47:28 -- common/autotest_common.sh@10 -- # set +x 00:03:53.957 ************************************ 00:03:53.957 END TEST per_node_1G_alloc 00:03:53.957 ************************************ 00:03:53.957 04:47:28 -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:53.957 04:47:28 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:03:53.957 04:47:28 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:03:53.957 04:47:28 -- common/autotest_common.sh@10 -- # set +x 00:03:53.957 ************************************ 00:03:53.957 START TEST even_2G_alloc 00:03:53.957 ************************************ 00:03:53.957 04:47:28 -- common/autotest_common.sh@1114 -- # even_2G_alloc 00:03:53.957 04:47:28 -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:53.957 04:47:28 -- setup/hugepages.sh@49 -- # local size=2097152 00:03:53.957 04:47:28 -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:53.957 04:47:28 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:53.957 04:47:28 -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:53.957 04:47:28 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:53.957 04:47:28 -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:53.957 04:47:28 -- setup/hugepages.sh@62 -- # local user_nodes 00:03:53.957 04:47:28 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:53.957 04:47:28 -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:53.957 04:47:28 -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:53.957 04:47:28 -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:53.957 04:47:28 -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:53.958 04:47:28 -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:53.958 04:47:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.958 04:47:28 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:53.958 04:47:28 -- setup/hugepages.sh@83 -- # : 512 00:03:53.958 04:47:28 -- setup/hugepages.sh@84 -- # : 1 00:03:53.958 04:47:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.958 04:47:28 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:53.958 04:47:28 -- setup/hugepages.sh@83 -- # : 0 00:03:53.958 04:47:28 -- setup/hugepages.sh@84 -- # : 0 00:03:53.958 04:47:28 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:53.958 04:47:28 -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:53.958 04:47:28 -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:53.958 04:47:28 -- setup/hugepages.sh@153 -- # setup output 00:03:53.958 04:47:28 -- setup/common.sh@9 -- # [[ output == output ]] 00:03:53.958 04:47:28 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:03:57.250 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:57.250 
0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:57.250 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:57.250 04:47:32 -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:57.250 04:47:32 -- setup/hugepages.sh@89 -- # local node 00:03:57.250 04:47:32 -- setup/hugepages.sh@90 -- # local sorted_t 00:03:57.250 04:47:32 -- setup/hugepages.sh@91 -- # local sorted_s 00:03:57.250 04:47:32 -- setup/hugepages.sh@92 -- # local surp 00:03:57.250 04:47:32 -- setup/hugepages.sh@93 -- # local resv 00:03:57.250 04:47:32 -- setup/hugepages.sh@94 -- # local anon 00:03:57.250 04:47:32 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:57.250 04:47:32 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:57.250 04:47:32 -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:57.250 04:47:32 -- setup/common.sh@18 -- # local node= 00:03:57.250 04:47:32 -- setup/common.sh@19 -- # local var val 00:03:57.250 04:47:32 -- setup/common.sh@20 -- # local mem_f mem 00:03:57.250 04:47:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:57.250 04:47:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:57.250 04:47:32 -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:57.250 04:47:32 -- setup/common.sh@28 -- # mapfile -t mem 00:03:57.250 04:47:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:57.250 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.250 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.250 04:47:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44232552 kB' 'MemAvailable: 45981164 kB' 'Buffers: 7364 kB' 'Cached: 9468068 kB' 'SwapCached: 32 kB' 'Active: 8527412 kB' 'Inactive: 1466156 kB' 'Active(anon): 8031328 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521412 kB' 'Mapped: 187956 kB' 'Shmem: 7530948 kB' 'KReclaimable: 476340 kB' 'Slab: 1314776 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 838436 kB' 'KernelStack: 21856 kB' 'PageTables: 8116 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9214332 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214464 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:03:57.250 04:47:32 -- 
setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:57.250 04:47:32 -- setup/common.sh@32 -- # continue
[repeated xtrace entries condensed: the get_meminfo read loop "continue"d past each non-matching /proc/meminfo key (MemFree through VmallocChunk) while searching for AnonHugePages]
00:03:57.251 04:47:32 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:57.251 04:47:32 -- setup/common.sh@31 -- # IFS=': '
00:03:57.251 04:47:32 -- setup/common.sh@31 -- # read -r var val _
00:03:57.251 04:47:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:57.251 04:47:32 -- setup/common.sh@33 -- # echo 0
00:03:57.251 04:47:32 -- setup/common.sh@33 -- # return 0
00:03:57.251 04:47:32 -- setup/hugepages.sh@97 -- # anon=0
00:03:57.251 04:47:32 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:57.251 04:47:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:57.251 04:47:32 -- setup/common.sh@18 -- # local node=
00:03:57.251 04:47:32 -- setup/common.sh@19 -- # local var val
00:03:57.251 04:47:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:57.251 04:47:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:57.251 04:47:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:57.251 04:47:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:57.251 04:47:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:57.251 04:47:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:57.251 04:47:32 -- setup/common.sh@31 -- # IFS=': '
00:03:57.251 04:47:32 -- setup/common.sh@31 -- # read -r var val _
00:03:57.251 04:47:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44241860 kB' 'MemAvailable: 45990472 kB' 'Buffers: 7364 kB' 'Cached: 9468072 kB' 'SwapCached: 32 kB' 'Active: 8527932 kB' 'Inactive: 1466156 kB' 'Active(anon): 8031848 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521932 kB' 'Mapped: 188008 kB' 'Shmem: 7530952 kB' 'KReclaimable: 476340 kB' 'Slab: 1314840 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 838500 kB' 'KernelStack: 21872 kB' 'PageTables: 8232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9213976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214400 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:03:57.251 04:47:32 [xtrace condensed: setup/common.sh@31-32 reads every key from MemTotal through HugePages_Rsvd and continues past each one that is not HugePages_Surp]
00:03:57.252 04:47:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:57.252 04:47:32 -- setup/common.sh@33 -- # echo 0
00:03:57.252 04:47:32 -- setup/common.sh@33 -- # return 0
00:03:57.252 04:47:32 -- setup/hugepages.sh@99 -- # surp=0
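The fifty-odd skipped keys condensed above all come from one small loop in setup/common.sh: get_meminfo splits each meminfo line on ': ' and echoes the value as soon as the requested key matches. A minimal re-creation of that technique, as a sketch under assumptions: the function name get_meminfo_sketch is hypothetical, and the body is my reconstruction of the traced logic, not the SPDK helper verbatim.

#!/usr/bin/env bash
shopt -s extglob    # for the +([0-9]) pattern used below

# get_meminfo_sketch KEY [NODE] (hypothetical name): print the value of KEY
# from /proc/meminfo, or from NODE's sysfs meminfo when NODE is given.
get_meminfo_sketch() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # A node number switches the source file, as in the node=0 call later on;
    # with node empty, /sys/devices/system/node/node/meminfo never exists.
    if [[ -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem <"$mem_f"
    # Per-node files prefix each line with "Node N "; strip it, mirroring
    # mem=("${mem[@]#Node +([0-9]) }") in the trace.
    mem=("${mem[@]#Node +([0-9]) }")
    local line var val _
    for line in "${mem[@]}"; do
        # IFS=': ' splits "MemTotal: 60283768 kB" into var=MemTotal,
        # val=60283768, _=kB. The continue below is the long skip loop
        # visible in the xtrace output.
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo_sketch AnonHugePages    # prints 0 on this runner (value in kB)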
00:03:57.252 04:47:32 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:57.252 04:47:32 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:57.252 04:47:32 -- setup/common.sh@18 -- # local node=
00:03:57.252 04:47:32 -- setup/common.sh@19 -- # local var val
00:03:57.252 04:47:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:57.252 04:47:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:57.252 04:47:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:57.252 04:47:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:57.252 04:47:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:57.252 04:47:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:57.252 04:47:32 -- setup/common.sh@31 -- # IFS=': '
00:03:57.252 04:47:32 -- setup/common.sh@31 -- # read -r var val _
00:03:57.253 04:47:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44241680 kB' 'MemAvailable: 45990292 kB' 'Buffers: 7364 kB' 'Cached: 9468072 kB' 'SwapCached: 32 kB' 'Active: 8527484 kB' 'Inactive: 1466156 kB' 'Active(anon): 8031400 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521444 kB' 'Mapped: 187928 kB' 'Shmem: 7530952 kB' 'KReclaimable: 476340 kB' 'Slab: 1314808 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 838468 kB' 'KernelStack: 21792 kB' 'PageTables: 7924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9213992 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:03:57.253 04:47:32 [xtrace condensed: setup/common.sh@31-32 reads every key from MemTotal through HugePages_Free and continues past each one that is not HugePages_Rsvd]
00:03:57.254 04:47:32 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:03:57.254 04:47:32 -- setup/common.sh@33 -- # echo 0
00:03:57.254 04:47:32 -- setup/common.sh@33 -- # return 0
00:03:57.254 04:47:32 -- setup/hugepages.sh@100 -- # resv=0
00:03:57.254 04:47:32 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:03:57.254 nr_hugepages=1024
00:03:57.254 04:47:32 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:03:57.254 resv_hugepages=0
00:03:57.254 04:47:32 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:03:57.254 surplus_hugepages=0
00:03:57.254 04:47:32 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:03:57.254 anon_hugepages=0
00:03:57.254 04:47:32 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:57.254 04:47:32 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
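With anon, surp, and resv all resolved to 0, hugepages.sh verifies the global pool before walking the NUMA nodes: the HugePages_Total the kernel reports must equal the requested page count plus surplus and reserved pages. A worked form of that check with this run's values; a sketch that reuses the hypothetical get_meminfo_sketch helper from above, not SPDK's own code.

# Verification mirroring the setup/hugepages.sh@107-110 checks in the trace.
nr_hugepages=1024                              # pages requested for the test
surp=$(get_meminfo_sketch HugePages_Surp)      # 0 in this log
resv=$(get_meminfo_sketch HugePages_Rsvd)      # 0 in this log
total=$(get_meminfo_sketch HugePages_Total)    # 1024 in this log

# 1024 == 1024 + 0 + 0, so both arithmetic tests in the trace succeed.
(( total == nr_hugepages + surp + resv )) && echo "global hugepage pool OK"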
00:03:57.254 04:47:32 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:03:57.254 04:47:32 -- setup/common.sh@17 -- # local get=HugePages_Total
00:03:57.254 04:47:32 -- setup/common.sh@18 -- # local node=
00:03:57.254 04:47:32 -- setup/common.sh@19 -- # local var val
00:03:57.254 04:47:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:57.254 04:47:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:57.254 04:47:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:57.254 04:47:32 -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:57.254 04:47:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:57.254 04:47:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:57.254 04:47:32 -- setup/common.sh@31 -- # IFS=': '
00:03:57.254 04:47:32 -- setup/common.sh@31 -- # read -r var val _
00:03:57.254 04:47:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44242188 kB' 'MemAvailable: 45990800 kB' 'Buffers: 7364 kB' 'Cached: 9468100 kB' 'SwapCached: 32 kB' 'Active: 8527168 kB' 'Inactive: 1466156 kB' 'Active(anon): 8031084 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521092 kB' 'Mapped: 187928 kB' 'Shmem: 7530980 kB' 'KReclaimable: 476340 kB' 'Slab: 1314808 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 838468 kB' 'KernelStack: 21776 kB' 'PageTables: 7852 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9214144 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:03:57.254 04:47:32 [xtrace condensed: setup/common.sh@31-32 reads every key from MemTotal through Unaccepted and continues past each one that is not HugePages_Total]
00:03:57.255 04:47:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:03:57.255 04:47:32 -- setup/common.sh@33 -- # echo 1024
00:03:57.255 04:47:32 -- setup/common.sh@33 -- # return 0
00:03:57.256 04:47:32 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:03:57.256 04:47:32 -- setup/hugepages.sh@112 -- # get_nodes
00:03:57.256 04:47:32 -- setup/hugepages.sh@27 -- # local node
00:03:57.256 04:47:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:57.256 04:47:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:57.256 04:47:32 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:03:57.256 04:47:32 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:03:57.256 04:47:32 -- setup/hugepages.sh@32 -- # no_nodes=2
00:03:57.256 04:47:32 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
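The walk the trace enters next repeats the same lookup once per NUMA node, switching get_meminfo to /sys/devices/system/node/nodeN/meminfo; get_nodes has just recorded 512 pages for each of the two nodes. A sketch of that per-node accounting follows; the array names echo the trace, but the loop body is my reconstruction (again using the hypothetical get_meminfo_sketch), not setup/hugepages.sh verbatim.

# Per-node accounting: each node was asked for 512 pages; reserved and
# per-node surplus pages are added before the counts are compared.
declare -a nodes_test=([0]=512 [1]=512)    # from get_nodes in this run
resv=0                                     # global HugePages_Rsvd above

for node in "${!nodes_test[@]}"; do
    (( nodes_test[node] += resv ))
    # HugePages_Surp for this node comes from its sysfs meminfo file,
    # whose lines carry the "Node N " prefix stripped by the parser.
    surp=$(get_meminfo_sketch HugePages_Surp "$node")
    (( nodes_test[node] += surp ))
done
printf 'expected pages on node %s: %s\n' 0 "${nodes_test[0]}" 1 "${nodes_test[1]}"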
'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # 
continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ 
Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.256 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.256 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.257 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.257 04:47:32 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.257 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.257 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.257 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.257 04:47:32 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.257 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.257 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.257 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.257 04:47:32 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.257 04:47:32 -- setup/common.sh@32 -- # continue 00:03:57.257 04:47:32 -- setup/common.sh@31 -- # IFS=': ' 00:03:57.257 04:47:32 -- setup/common.sh@31 -- # read -r var val _ 00:03:57.257 04:47:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:57.257 04:47:32 -- setup/common.sh@33 -- # echo 0 00:03:57.257 04:47:32 -- setup/common.sh@33 -- # return 0 00:03:57.257 04:47:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:57.257 04:47:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:57.257 04:47:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:57.257 04:47:32 -- 
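What the trace above is doing: setup/common.sh's get_meminfo walks the node meminfo file one IFS=': ' read at a time until the requested field name matches, then echoes its value. A minimal stand-alone sketch of that lookup, assuming the standard /proc/meminfo and sysfs per-node layout (the function name and parameter handling here are illustrative, not the SPDK helper itself):

  #!/usr/bin/env bash
  # Sketch of the field scan traced above. Assumptions: Linux /proc/meminfo
  # plus /sys/devices/system/node/node<N>/meminfo; function and parameter
  # names are illustrative, not SPDK's.
  get_meminfo_sketch() {
      local get=$1 node=$2
      local mem_f=/proc/meminfo line var val _
      # Per-node counters live in sysfs; every line there starts "Node <N> ".
      if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
          mem_f=/sys/devices/system/node/node$node/meminfo
      fi
      while read -r line; do
          line=${line#"Node $node "}             # drop the per-node prefix, if any
          IFS=': ' read -r var val _ <<<"$line"  # e.g. var=HugePages_Surp, val=0
          if [[ $var == "$get" ]]; then
              echo "$val"   # mirrors the "echo 0" / "return 0" seen in the trace
              return 0
          fi
      done <"$mem_f"
      return 1
  }
  get_meminfo_sketch HugePages_Surp 0   # prints node0's surplus count, e.g. 0

Matching on the parsed field name rather than grepping keeps one helper usable for both the prefixed per-node files and plain /proc/meminfo.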
00:03:57.256 04:47:32 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:03:57.256 04:47:32 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:03:57.256 04:47:32 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:03:57.257 04:47:32 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:57.257 04:47:32 -- setup/common.sh@18 -- # local node=1
00:03:57.257 04:47:32 -- setup/common.sh@19 -- # local var val
00:03:57.257 04:47:32 -- setup/common.sh@20 -- # local mem_f mem
00:03:57.257 04:47:32 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:57.257 04:47:32 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:03:57.257 04:47:32 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:03:57.257 04:47:32 -- setup/common.sh@28 -- # mapfile -t mem
00:03:57.257 04:47:32 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:57.257 04:47:32 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698400 kB' 'MemFree: 18787248 kB' 'MemUsed: 8911152 kB' 'SwapCached: 0 kB' 'Active: 4891568 kB' 'Inactive: 1196316 kB' 'Active(anon): 4776488 kB' 'Inactive(anon): 17696 kB' 'Active(file): 115080 kB' 'Inactive(file): 1178620 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5975960 kB' 'Mapped: 43704 kB' 'AnonPages: 112032 kB' 'Shmem: 4682260 kB' 'KernelStack: 9256 kB' 'PageTables: 2648 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224960 kB' 'Slab: 664328 kB' 'SReclaimable: 224960 kB' 'SUnreclaim: 439368 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:03:57.257 04:47:32 -- setup/common.sh@31 -- # IFS=': '
00:03:57.257 04:47:32 -- setup/common.sh@31 -- # read -r var val _
00:03:57.257 04:47:32 -- setup/common.sh@31-32 -- # (per-field loop: fields MemTotal through HugePages_Free each fail [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] and continue)
00:03:57.258 04:47:32 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:57.258 04:47:32 -- setup/common.sh@33 -- # echo 0
00:03:57.258 04:47:32 -- setup/common.sh@33 -- # return 0
00:03:57.258 04:47:32 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:03:57.258 04:47:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:57.258 04:47:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:57.258 04:47:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:57.258 04:47:32 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:03:57.258 node0=512 expecting 512
00:03:57.258 04:47:32 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:03:57.258 04:47:32 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:03:57.258 04:47:32 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:03:57.258 04:47:32 -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:03:57.258 node1=512 expecting 512
00:03:57.258 04:47:32 -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:03:57.258
00:03:57.258 real 0m3.501s
00:03:57.258 user 0m1.373s
00:03:57.258 sys 0m2.184s
00:03:57.258 04:47:32 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:03:57.258 04:47:32 -- common/autotest_common.sh@10 -- # set +x
00:03:57.258 ************************************
00:03:57.258 END TEST even_2G_alloc
00:03:57.258 ************************************
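The banner above closes the even test: its pass condition is simply that every node reports the HugePages_Total it was asked for. A minimal sketch of that comparison using the values from the trace, assuming two NUMA nodes (the array names mirror hugepages.sh, but this standalone loop is illustrative):

  #!/usr/bin/env bash
  # Sketch of the per-node assertion even_2G_alloc just logged; values are
  # the 512/512 measured and requested counts from the trace above.
  declare -a nodes_test=([0]=512 [1]=512)  # HugePages_Total read back per node
  declare -a nodes_sys=([0]=512 [1]=512)   # what the test asked each node for
  for node in "${!nodes_test[@]}"; do
      echo "node$node=${nodes_sys[$node]} expecting ${nodes_test[$node]}"
      [[ ${nodes_test[$node]} == "${nodes_sys[$node]}" ]] || exit 1
  done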
00:03:57.258 04:47:32 -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc
00:03:57.258 04:47:32 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:03:57.258 04:47:32 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:03:57.258 04:47:32 -- common/autotest_common.sh@10 -- # set +x
00:03:57.258 ************************************
00:03:57.258 START TEST odd_alloc
00:03:57.258 ************************************
00:03:57.258 04:47:32 -- common/autotest_common.sh@1114 -- # odd_alloc
00:03:57.258 04:47:32 -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176
00:03:57.258 04:47:32 -- setup/hugepages.sh@49 -- # local size=2098176
00:03:57.258 04:47:32 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:03:57.258 04:47:32 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:03:57.258 04:47:32 -- setup/hugepages.sh@57 -- # nr_hugepages=1025
00:03:57.258 04:47:32 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:03:57.258 04:47:32 -- setup/hugepages.sh@62 -- # user_nodes=()
00:03:57.258 04:47:32 -- setup/hugepages.sh@62 -- # local user_nodes
00:03:57.258 04:47:32 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025
00:03:57.258 04:47:32 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:03:57.258 04:47:32 -- setup/hugepages.sh@67 -- # nodes_test=()
00:03:57.258 04:47:32 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:03:57.258 04:47:32 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:03:57.258 04:47:32 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:03:57.258 04:47:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:57.258 04:47:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:03:57.258 04:47:32 -- setup/hugepages.sh@83 -- # : 513
00:03:57.258 04:47:32 -- setup/hugepages.sh@84 -- # : 1
00:03:57.258 04:47:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:57.258 04:47:32 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513
00:03:57.258 04:47:32 -- setup/hugepages.sh@83 -- # : 0
00:03:57.258 04:47:32 -- setup/hugepages.sh@84 -- # : 0
00:03:57.258 04:47:32 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:03:57.258 04:47:32 -- setup/hugepages.sh@160 -- # HUGEMEM=2049
00:03:57.258 04:47:32 -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes
00:03:57.258 04:47:32 -- setup/hugepages.sh@160 -- # setup output
00:03:57.258 04:47:32 -- setup/common.sh@9 -- # [[ output == output ]]
00:03:57.258 04:47:32 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:00.547 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:00.547 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:00.810 04:47:35 -- setup/hugepages.sh@161 -- # verify_nr_hugepages
00:04:00.810 04:47:35 -- setup/hugepages.sh@89 -- # local node
00:04:00.810 04:47:35 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:00.810 04:47:35 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:00.810 04:47:35 -- setup/hugepages.sh@92 -- # local surp
00:04:00.810 04:47:35 -- setup/hugepages.sh@93 -- # local resv
00:04:00.810 04:47:35 -- setup/hugepages.sh@94 -- # local anon
00:04:00.810 04:47:35 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
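Before the verification below, the sizing logged above is worth unpacking: 2098176 kB rounds up to 1025 pages of 2048 kB, and the odd page lands on node0, giving the 513/512 split in the trace. A minimal sketch of that arithmetic, with the rounding and remainder handling inferred from the logged values rather than quoted from hugepages.sh:

  #!/usr/bin/env bash
  # Sketch of the odd_alloc sizing: ceiling division to a page count, then an
  # even split with the remainder pinned to node0. Values from the trace above.
  size_kb=2098176
  hugepage_kb=2048
  nr_hugepages=$(( (size_kb + hugepage_kb - 1) / hugepage_kb ))  # 1025
  no_nodes=2
  per_node=$(( nr_hugepages / no_nodes ))                        # 512
  declare -a nodes_test
  for (( node = 0; node < no_nodes; node++ )); do
      nodes_test[node]=$per_node
  done
  nodes_test[0]=$(( nodes_test[0] + nr_hugepages % no_nodes ))   # node0 gets 513
  echo "node0=${nodes_test[0]} node1=${nodes_test[1]}"           # node0=513 node1=512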
00:04:00.810 04:47:35 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:00.810 04:47:35 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:00.810 04:47:35 -- setup/common.sh@18 -- # local node=
00:04:00.810 04:47:35 -- setup/common.sh@19 -- # local var val
00:04:00.810 04:47:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.810 04:47:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.810 04:47:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:00.810 04:47:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:00.810 04:47:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.810 04:47:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.810 04:47:35 -- setup/common.sh@31 -- # IFS=': '
00:04:00.810 04:47:35 -- setup/common.sh@31 -- # read -r var val _
00:04:00.810 04:47:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44261408 kB' 'MemAvailable: 46010020 kB' 'Buffers: 7364 kB' 'Cached: 9468204 kB' 'SwapCached: 32 kB' 'Active: 8528084 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032000 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521412 kB' 'Mapped: 188028 kB' 'Shmem: 7531084 kB' 'KReclaimable: 476340 kB' 'Slab: 1314256 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837916 kB' 'KernelStack: 21760 kB' 'PageTables: 7812 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 9215264 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214320 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:04:00.810 04:47:35 -- setup/common.sh@31-32 -- # (per-field loop: fields MemTotal through HardwareCorrupted each fail [[ $var == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] and continue)
00:04:00.811 04:47:35 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:00.811 04:47:35 -- setup/common.sh@33 -- # echo 0
00:04:00.811 04:47:35 -- setup/common.sh@33 -- # return 0
00:04:00.811 04:47:35 -- setup/hugepages.sh@97 -- # anon=0
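The anon=0 above comes from a two-step probe: the [[ ... != *\[\n\e\v\e\r\]* ]] guard checks that transparent hugepages are not pinned to never before AnonHugePages is even consulted. A minimal sketch of that probe, assuming the standard Linux sysfs and procfs paths (the variable handling is illustrative):

  #!/usr/bin/env bash
  # Sketch of the anon-THP probe behind anon=0: only count AnonHugePages when
  # THP is not disabled outright. Paths are the standard kernel ones.
  thp_mode=$(cat /sys/kernel/mm/transparent_hugepage/enabled)   # e.g. "always [madvise] never"
  anon=0
  if [[ $thp_mode != *"[never]"* ]]; then
      anon=$(awk '/^AnonHugePages:/ {print $2}' /proc/meminfo)  # kB of anon THP in use
  fi
  echo "anon=${anon}"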
00:04:00.811 04:47:35 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:00.811 04:47:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:00.811 04:47:35 -- setup/common.sh@18 -- # local node=
00:04:00.811 04:47:35 -- setup/common.sh@19 -- # local var val
00:04:00.811 04:47:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.811 04:47:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.811 04:47:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:00.811 04:47:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:00.811 04:47:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.811 04:47:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.811 04:47:35 -- setup/common.sh@31 -- # IFS=': '
00:04:00.811 04:47:35 -- setup/common.sh@31 -- # read -r var val _
00:04:00.811 04:47:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44268252 kB' 'MemAvailable: 46016864 kB' 'Buffers: 7364 kB' 'Cached: 9468208 kB' 'SwapCached: 32 kB' 'Active: 8528236 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032152 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522136 kB' 'Mapped: 187932 kB' 'Shmem: 7531088 kB' 'KReclaimable: 476340 kB' 'Slab: 1314228 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837888 kB' 'KernelStack: 21808 kB' 'PageTables: 7972 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 9215276 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:04:00.811 04:47:35 -- setup/common.sh@31-32 -- # (per-field loop: fields MemTotal through HugePages_Rsvd each fail [[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] and continue)
00:04:00.812 04:47:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:00.812 04:47:35 -- setup/common.sh@33 -- # echo 0
00:04:00.812 04:47:35 -- setup/common.sh@33 -- # return 0
00:04:00.812 04:47:35 -- setup/hugepages.sh@99 -- # surp=0
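With surp known, the trace below reads HugePages_Rsvd, the last input to the same consistency identity that hugepages.sh@110 applied earlier: the pool total must equal the requested page count plus surplus plus reserved. A minimal sketch of that check, assuming the standard /proc/meminfo fields (the awk helper is illustrative; 1025 is odd_alloc's request from the trace):

  #!/usr/bin/env bash
  # Sketch of the global hugepage bookkeeping verify_nr_hugepages assembles.
  read_hp() { awk -v key="$1:" '$1 == key {print $2}' /proc/meminfo; }
  total=$(read_hp HugePages_Total)   # 1025 in the snapshot above
  surp=$(read_hp HugePages_Surp)     # 0
  rsvd=$(read_hp HugePages_Rsvd)     # 0
  nr_hugepages=1025                  # value requested by odd_alloc above
  (( total == nr_hugepages + surp + rsvd )) && echo "hugepage pool consistent"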
'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:04:00.812 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.812 04:47:35 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.812 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.812 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.812 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.812 04:47:35 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.812 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.812 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.812 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.812 04:47:35 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.812 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.812 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # read -r var val _ 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:00.813 04:47:35 -- setup/common.sh@32 -- # continue 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # IFS=': ' 00:04:00.813 04:47:35 -- setup/common.sh@31 -- # 
00:04:00.814 04:47:35 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025
00:04:00.814 nr_hugepages=1025
00:04:00.814 04:47:35 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:00.814 resv_hugepages=0
00:04:00.814 04:47:35 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:00.814 surplus_hugepages=0
00:04:00.814 04:47:35 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:00.814 anon_hugepages=0
00:04:00.814 04:47:35 -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:00.814 04:47:35 -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages ))
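
The arithmetic checks just above (and the recheck that follows the next read) encode the test's core invariant: the kernel's HugePages_Total must equal the requested count plus whatever the kernel reports as surplus and reserved. Roughly, reusing the get_meminfo sketch above:

    nr_hugepages=1025                          # requested by the odd_alloc test
    surp=$(get_meminfo HugePages_Surp)         # kernel-allocated overflow pages, 0 here
    resv=$(get_meminfo HugePages_Rsvd)         # reserved-but-unfaulted pages, 0 here
    total=$(get_meminfo HugePages_Total)       # 1025 here
    (( total == nr_hugepages + surp + resv )) || echo "hugepage accounting mismatch" >&2
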
00:04:00.814 04:47:35 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:00.814 04:47:35 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:00.814 04:47:35 -- setup/common.sh@18 -- # local node=
00:04:00.814 04:47:35 -- setup/common.sh@19 -- # local var val
00:04:00.814 04:47:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.814 04:47:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.814 04:47:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:00.814 04:47:35 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:00.814 04:47:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.814 04:47:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.814 04:47:35 -- setup/common.sh@31 -- # IFS=': '
00:04:00.814 04:47:35 -- setup/common.sh@31 -- # read -r var val _
00:04:00.814 04:47:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44268456 kB' 'MemAvailable: 46017068 kB' 'Buffers: 7364 kB' 'Cached: 9468236 kB' 'SwapCached: 32 kB' 'Active: 8528144 kB' 'Inactive: 1466156 kB' 'Active(anon): 8032060 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 521964 kB' 'Mapped: 187932 kB' 'Shmem: 7531116 kB' 'KReclaimable: 476340 kB' 'Slab: 1314228 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837888 kB' 'KernelStack: 21792 kB' 'PageTables: 7924 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37480888 kB' 'Committed_AS: 9215304 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214288 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
00:04:00.815 [xtrace: per-field scan elided -- fields read and skipped until HugePages_Total matches]
00:04:00.815 04:47:35 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:00.815 04:47:35 -- setup/common.sh@33 -- # echo 1025
00:04:00.815 04:47:35 -- setup/common.sh@33 -- # return 0
00:04:00.815 04:47:35 -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv ))
00:04:00.815 04:47:35 -- setup/hugepages.sh@112 -- # get_nodes
00:04:00.815 04:47:35 -- setup/hugepages.sh@27 -- # local node
00:04:00.815 04:47:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:00.815 04:47:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:00.815 04:47:35 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:00.815 04:47:35 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513
00:04:00.815 04:47:35 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:00.815 04:47:35 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
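
get_nodes, traced above, discovers the NUMA topology by globbing sysfs and records a per-node hugepage count. A sketch of the idea, reusing the get_meminfo sketch from earlier; that the 512/513 values come from each node's HugePages_Total counter is an inference from the per-node snapshots later in this trace, not confirmed source:

    shopt -s extglob
    nodes_sys=()
    for node in /sys/devices/system/node/node+([0-9]); do
        n=${node##*node}                                   # "node0" -> "0"
        nodes_sys[n]=$(get_meminfo HugePages_Total "$n")   # assumed source of 512/513
    done
    no_nodes=${#nodes_sys[@]}    # 2 on this machine
    (( no_nodes > 0 )) || exit 1
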
00:04:00.815 04:47:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:00.815 04:47:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:00.815 04:47:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:00.815 04:47:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:00.815 04:47:35 -- setup/common.sh@18 -- # local node=0
00:04:00.815 04:47:35 -- setup/common.sh@19 -- # local var val
00:04:00.815 04:47:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.815 04:47:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.815 04:47:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:00.815 04:47:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:00.815 04:47:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.815 04:47:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.815 04:47:35 -- setup/common.sh@31 -- # IFS=': '
00:04:00.815 04:47:35 -- setup/common.sh@31 -- # read -r var val _
00:04:00.815 04:47:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25461452 kB' 'MemUsed: 7123916 kB' 'SwapCached: 32 kB' 'Active: 3636904 kB' 'Inactive: 269840 kB' 'Active(anon): 3255900 kB' 'Inactive(anon): 60 kB' 'Active(file): 381004 kB' 'Inactive(file): 269780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3499636 kB' 'Mapped: 144732 kB' 'AnonPages: 410336 kB' 'Shmem: 2848820 kB' 'KernelStack: 12584 kB' 'PageTables: 5432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 251380 kB' 'Slab: 650080 kB' 'SReclaimable: 251380 kB' 'SUnreclaim: 398700 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:00.816 [xtrace: per-field scan of the node0 counters elided -- fields read and skipped until HugePages_Surp matches]
00:04:00.816 04:47:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:00.816 04:47:35 -- setup/common.sh@33 -- # echo 0
00:04:00.816 04:47:35 -- setup/common.sh@33 -- # return 0
00:04:00.816 04:47:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
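
The per-node reads work because every line of /sys/devices/system/node/nodeN/meminfo repeats the node number, which is why the helper strips a "Node N " prefix before matching. An illustrative one-liner against node0:

    # Each line looks like "Node 0 HugePages_Free: 512"; pull one field out:
    while IFS=': ' read -r _ n var val _; do
        [[ $var == HugePages_Free ]] && echo "node$n HugePages_Free=$val"
    done < /sys/devices/system/node/node0/meminfo
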
00:04:00.816 04:47:35 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:00.816 04:47:35 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:00.816 04:47:35 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:00.816 04:47:35 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:00.816 04:47:35 -- setup/common.sh@18 -- # local node=1
00:04:00.816 04:47:35 -- setup/common.sh@19 -- # local var val
00:04:00.816 04:47:35 -- setup/common.sh@20 -- # local mem_f mem
00:04:00.816 04:47:35 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:00.816 04:47:35 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:00.816 04:47:35 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:00.816 04:47:35 -- setup/common.sh@28 -- # mapfile -t mem
00:04:00.816 04:47:35 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:00.816 04:47:35 -- setup/common.sh@31 -- # IFS=': '
00:04:00.816 04:47:35 -- setup/common.sh@31 -- # read -r var val _
00:04:00.817 04:47:35 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698400 kB' 'MemFree: 18803292 kB' 'MemUsed: 8895108 kB' 'SwapCached: 0 kB' 'Active: 4895808 kB' 'Inactive: 1196316 kB' 'Active(anon): 4780728 kB' 'Inactive(anon): 17696 kB' 'Active(file): 115080 kB' 'Inactive(file): 1178620 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5975996 kB' 'Mapped: 43704 kB' 'AnonPages: 116248 kB' 'Shmem: 4682296 kB' 'KernelStack: 9224 kB' 'PageTables: 2548 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224960 kB' 'Slab: 664148 kB' 'SReclaimable: 224960 kB' 'SUnreclaim: 439188 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0'
00:04:00.817 [xtrace: per-field scan of the node1 counters elided -- fields read and skipped until HugePages_Surp matches]
00:04:00.817 04:47:35 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:00.817 04:47:35 -- setup/common.sh@33 -- # echo 0
00:04:00.817 04:47:35 -- setup/common.sh@33 -- # return 0
00:04:00.817 04:47:35 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
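
Both nodes report HugePages_Surp of 0, so the expected per-node counts stay at the configured 512/513 split. Note what the comparison traced next actually checks: sorted_t and sorted_s are indexed arrays keyed by count, so listing their indices yields a sorted multiset, and the test only demands that {512, 513} shows up across the nodes, not that a particular node holds 513. In miniature:

    nodes_test=(512 513)    # what the test configured
    nodes_sys=(513 512)     # what the kernel placed; swapped order is fine
    sorted_t=() sorted_s=()
    for node in "${!nodes_test[@]}"; do
        sorted_t[nodes_test[node]]=1    # index by count; indices list out sorted
        sorted_s[nodes_sys[node]]=1
    done
    # "512 513" == "512 513" regardless of which node got which count
    [[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo "per-node spread OK"
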
00:04:00.817 04:47:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:00.817 04:47:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:00.817 04:47:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:00.817 04:47:35 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513'
00:04:00.817 node0=512 expecting 513
00:04:00.817 04:47:35 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:00.817 04:47:35 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:00.817 04:47:35 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:00.817 04:47:35 -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512'
00:04:00.817 node1=513 expecting 512
00:04:00.817 04:47:35 -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]]
00:04:00.818
00:04:00.818 real 0m3.520s
00:04:00.818 user 0m1.352s
00:04:00.818 sys 0m2.239s
00:04:00.818 04:47:35 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:00.818 04:47:35 -- common/autotest_common.sh@10 -- # set +x
00:04:00.818 ************************************
00:04:00.818 END TEST odd_alloc
00:04:00.818 ************************************
00:04:00.818 04:47:35 -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc
00:04:00.818 04:47:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:00.818 04:47:35 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:00.818 04:47:35 -- common/autotest_common.sh@10 -- # set +x
00:04:00.818 ************************************
00:04:00.818 START TEST custom_alloc
00:04:00.818 ************************************
00:04:00.818 04:47:35 -- common/autotest_common.sh@1114 -- # custom_alloc
00:04:00.818 04:47:35 -- setup/hugepages.sh@167 -- # local IFS=,
00:04:00.818 04:47:35 -- setup/hugepages.sh@169 -- # local node
00:04:00.818 04:47:35 -- setup/hugepages.sh@170 -- # nodes_hp=()
00:04:00.818 04:47:35 -- setup/hugepages.sh@170 -- # local nodes_hp
00:04:00.818 04:47:35 -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0
00:04:00.818 04:47:35 -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576
00:04:00.818 04:47:35 -- setup/hugepages.sh@49 -- # local size=1048576
00:04:00.818 04:47:35 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@57 -- # nr_hugepages=512
00:04:00.818 04:47:35 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:00.818 04:47:35 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:00.818 04:47:35 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:00.818 04:47:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=512
00:04:00.818 04:47:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:00.818 04:47:35 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:00.818 04:47:35 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:00.818 04:47:35 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@74 -- # (( 0 > 0 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:00.818 04:47:35 -- setup/hugepages.sh@83 -- # : 256
00:04:00.818 04:47:35 -- setup/hugepages.sh@84 -- # : 1
00:04:00.818 04:47:35 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256
00:04:00.818 04:47:35 -- setup/hugepages.sh@83 -- # : 0
00:04:00.818 04:47:35 -- setup/hugepages.sh@84 -- # : 0
00:04:00.818 04:47:35 -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
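
The get_test_nr_hugepages call just traced converts a requested size into a page count, then splits it evenly when no per-node counts were supplied. Assuming the sizes are in kB (consistent with the 'Hugepagesize: 2048 kB' reported in the snapshots), a sketch of the arithmetic:

    size=1048576                    # requested allocation in kB (1 GiB), per the trace
    default_hugepages=2048          # kB per hugepage, from Hugepagesize
    (( size >= default_hugepages )) || exit 1
    nr_hugepages=$(( size / default_hugepages ))    # 512 pages
    no_nodes=2
    nodes_test=()
    # No explicit per-node request: give each node an even share, as above.
    for (( node = no_nodes - 1; node >= 0; node-- )); do
        nodes_test[node]=$(( nr_hugepages / no_nodes ))    # 256 each
    done
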
00:04:00.818 04:47:35 -- setup/hugepages.sh@175 -- # nodes_hp[0]=512
00:04:00.818 04:47:35 -- setup/hugepages.sh@176 -- # (( 2 > 1 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152
00:04:00.818 04:47:35 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:00.818 04:47:35 -- setup/hugepages.sh@50 -- # (( 1 > 1 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:00.818 04:47:35 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node
00:04:00.818 04:47:35 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:00.818 04:47:35 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:00.818 04:47:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:00.818 04:47:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:00.818 04:47:35 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:00.818 04:47:35 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:00.818 04:47:35 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@74 -- # (( 1 > 0 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:00.818 04:47:35 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:00.818 04:47:35 -- setup/hugepages.sh@78 -- # return 0
00:04:00.818 04:47:35 -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024
00:04:00.818 04:47:35 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:00.818 04:47:35 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:00.818 04:47:35 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}"
00:04:00.818 04:47:35 -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
00:04:00.818 04:47:35 -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node
00:04:00.818 04:47:35 -- setup/hugepages.sh@62 -- # user_nodes=()
00:04:00.818 04:47:35 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:00.818 04:47:35 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:00.818 04:47:35 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:00.818 04:47:35 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:00.818 04:47:35 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:00.818 04:47:35 -- setup/hugepages.sh@69 -- # (( 0 > 0 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@74 -- # (( 2 > 0 ))
00:04:00.818 04:47:35 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:00.818 04:47:35 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512
00:04:00.818 04:47:35 -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}"
00:04:00.818 04:47:35 -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024
00:04:00.818 04:47:35 -- setup/hugepages.sh@78 -- # return 0
00:04:00.818 04:47:35 -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'
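
With nodes_hp[0]=512 and nodes_hp[1]=1024 recorded, the HUGENODE spec handed to setup.sh is just that array joined on commas, which is why custom_alloc set IFS=, up front. A sketch of the construction:

    nodes_hp=([0]=512 [1]=1024)
    HUGENODE=() _nr_hugepages=0
    for node in "${!nodes_hp[@]}"; do
        HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}")
        (( _nr_hugepages += nodes_hp[node] ))    # 1536 total, matching the trace
    done
    HUGENODE=$(IFS=,; echo "${HUGENODE[*]}")     # join on commas
    echo "$HUGENODE"    # nodes_hp[0]=512,nodes_hp[1]=1024
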
vfio-pci driver 00:04:04.105 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:04.105 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:04.105 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:04.366 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:04.366 04:47:39 -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:04:04.366 04:47:39 -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:04:04.366 04:47:39 -- setup/hugepages.sh@89 -- # local node 00:04:04.366 04:47:39 -- setup/hugepages.sh@90 -- # local sorted_t 00:04:04.366 04:47:39 -- setup/hugepages.sh@91 -- # local sorted_s 00:04:04.366 04:47:39 -- setup/hugepages.sh@92 -- # local surp 00:04:04.366 04:47:39 -- setup/hugepages.sh@93 -- # local resv 00:04:04.366 04:47:39 -- setup/hugepages.sh@94 -- # local anon 00:04:04.366 04:47:39 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:04.366 04:47:39 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:04.366 04:47:39 -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:04.366 04:47:39 -- setup/common.sh@18 -- # local node= 00:04:04.366 04:47:39 -- setup/common.sh@19 -- # local var val 00:04:04.366 04:47:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:04.366 04:47:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.366 04:47:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.366 04:47:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.366 04:47:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.366 04:47:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 43254012 kB' 'MemAvailable: 45002624 kB' 'Buffers: 7364 kB' 'Cached: 9468344 kB' 'SwapCached: 32 kB' 'Active: 8530776 kB' 'Inactive: 1466156 kB' 'Active(anon): 8034692 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 524472 kB' 'Mapped: 188308 kB' 'Shmem: 7531224 kB' 'KReclaimable: 476340 kB' 'Slab: 1313796 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837456 kB' 'KernelStack: 21792 kB' 'PageTables: 8040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 9220472 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214416 kB' 
'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ 
Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.366 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.366 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 
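The long scan above is setup/common.sh's get_meminfo walking /proc/meminfo one "Key: value" line at a time until it reaches the requested field (here AnonHugePages). Under `set -x`, bash prints the quoted right-hand side of each `[[ $var == "$get" ]]` test with every character backslash-escaped (e.g. \A\n\o\n\H\u\g\e\P\a\g\e\s), which is why the trace looks garbled: the quoting forces a literal string match rather than a glob match. A minimal standalone sketch of the same lookup follows; it is an illustrative rewrite, not the SPDK helper itself, which buffers the file with mapfile first:

#!/usr/bin/env bash
# Minimal get_meminfo-style lookup: scan "Key: value" pairs until the
# requested key matches, then print its numeric value. Illustrative
# rewrite only; setup/common.sh buffers /proc/meminfo via mapfile.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Quoting "$get" forces a literal comparison; under `set -x`
        # this is what appears as the backslash-escaped pattern above.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done < /proc/meminfo
    echo 0
}

get_meminfo AnonHugePages   # prints 0 on this node, matching the trace
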
00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:04.367 04:47:39 -- setup/common.sh@33 -- # echo 0 00:04:04.367 04:47:39 -- setup/common.sh@33 -- # return 0 00:04:04.367 04:47:39 -- setup/hugepages.sh@97 -- # anon=0 00:04:04.367 04:47:39 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:04.367 04:47:39 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:04.367 04:47:39 -- setup/common.sh@18 -- # local node= 00:04:04.367 04:47:39 -- setup/common.sh@19 -- # local var val 00:04:04.367 04:47:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:04.367 04:47:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.367 04:47:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.367 04:47:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.367 04:47:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.367 04:47:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 43254004 kB' 'MemAvailable: 45002616 kB' 'Buffers: 7364 kB' 'Cached: 9468348 kB' 'SwapCached: 32 kB' 'Active: 8529672 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033588 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523448 kB' 'Mapped: 187940 kB' 'Shmem: 7531228 kB' 'KReclaimable: 476340 kB' 'Slab: 1313824 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837484 kB' 'KernelStack: 21744 kB' 'PageTables: 7612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 9215936 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- 
setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.367 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.367 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var 
val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 
04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.368 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.368 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- 
setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:04.369 04:47:39 -- setup/common.sh@33 -- # echo 0 00:04:04.369 04:47:39 -- setup/common.sh@33 -- # return 0 00:04:04.369 04:47:39 -- setup/hugepages.sh@99 -- # surp=0 00:04:04.369 04:47:39 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:04.369 04:47:39 -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:04.369 04:47:39 -- setup/common.sh@18 -- # local node= 00:04:04.369 04:47:39 -- setup/common.sh@19 -- # local var val 00:04:04.369 04:47:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:04.369 04:47:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:04.369 04:47:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.369 04:47:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.369 04:47:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.369 04:47:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 43254136 kB' 'MemAvailable: 45002748 kB' 'Buffers: 7364 kB' 'Cached: 9468348 kB' 'SwapCached: 32 kB' 'Active: 8529640 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033556 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523440 kB' 'Mapped: 187940 kB' 'Shmem: 7531228 kB' 'KReclaimable: 476340 kB' 'Slab: 1313824 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837484 kB' 'KernelStack: 21808 kB' 'PageTables: 7980 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 9215952 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 
04:47:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- 
# IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.369 04:47:39 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.369 04:47:39 
-- setup/common.sh@32 -- # continue 00:04:04.369 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 
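The second and third scans pull HugePages_Surp (surplus pages allocated beyond the persistent pool through overcommit) and HugePages_Rsvd (pages a mapping has reserved but not yet faulted in). verify_nr_hugepages expects both to be 0 before comparing pool sizes; the log shows surp=0 above, and the Rsvd scan finishing below yields resv=0. For comparison only, the same counters could be read in a single pass; the test helper deliberately calls get_meminfo once per field, which is what produces the long per-key traces:

# One-pass read of every hugepage counter (an alternative shown for
# comparison, not what setup/common.sh does).
awk -F': *' '/^HugePages_/ { print $1, $2 }' /proc/meminfo
# Output on this node, per the snapshot above:
#   HugePages_Total 1536
#   HugePages_Free 1536
#   HugePages_Rsvd 0
#   HugePages_Surp 0
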
00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:04.370 04:47:39 -- setup/common.sh@33 -- # echo 0 00:04:04.370 04:47:39 -- setup/common.sh@33 -- # return 0 00:04:04.370 04:47:39 -- setup/hugepages.sh@100 -- # resv=0 00:04:04.370 04:47:39 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:04:04.370 nr_hugepages=1536 00:04:04.370 04:47:39 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:04.370 resv_hugepages=0 00:04:04.370 04:47:39 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:04.370 surplus_hugepages=0 00:04:04.370 04:47:39 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:04.370 anon_hugepages=0 00:04:04.370 04:47:39 -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:04:04.370 04:47:39 -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:04:04.370 04:47:39 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:04.370 04:47:39 -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:04.370 04:47:39 -- setup/common.sh@18 -- # local node= 00:04:04.370 04:47:39 -- setup/common.sh@19 -- # local var val 00:04:04.370 04:47:39 -- setup/common.sh@20 -- # local mem_f mem 00:04:04.370 04:47:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 
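With anon=0, surp=0 and resv=0 collected, verify_nr_hugepages checks the pool arithmetic at hugepages.sh@107 and @109: the 1536 pages requested for this test (512 on node0 plus 1024 on node1, per the HUGENODE string built earlier) must equal the kernel's total once surplus and reserved pages are accounted for. Restated as a sketch, with names mirroring the log:

# The identity checked at hugepages.sh@107/@109:
nr_hugepages=1536   # requested: 512 (node0) + 1024 (node1)
surp=0              # HugePages_Surp
resv=0              # HugePages_Rsvd
total=1536          # HugePages_Total, read next via get_meminfo

(( total == nr_hugepages + surp + resv )) || echo 'hugepage pool mismatch'
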
00:04:04.370 04:47:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:04.370 04:47:39 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:04.370 04:47:39 -- setup/common.sh@28 -- # mapfile -t mem 00:04:04.370 04:47:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:04.370 04:47:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 43254824 kB' 'MemAvailable: 45003436 kB' 'Buffers: 7364 kB' 'Cached: 9468376 kB' 'SwapCached: 32 kB' 'Active: 8529496 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033412 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523236 kB' 'Mapped: 187940 kB' 'Shmem: 7531256 kB' 'KReclaimable: 476340 kB' 'Slab: 1313824 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837484 kB' 'KernelStack: 21792 kB' 'PageTables: 7932 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 36957624 kB' 'Committed_AS: 9215968 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:04.370 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.370 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.641 04:47:39 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:04.641 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.641 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.641 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.641 04:47:39 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:04.641 04:47:39 -- setup/common.sh@32 -- # continue 00:04:04.641 04:47:39 -- setup/common.sh@31 -- # IFS=': ' 00:04:04.641 04:47:39 -- setup/common.sh@31 -- # read -r var val _ 00:04:04.641 04:47:39 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
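The snapshot printed for the HugePages_Total read is also internally consistent: with a 2048 kB page size, the 1536-page pool accounts for exactly the reported Hugetlb footprint.

# Quick cross-check of the meminfo snapshot above:
pages=1536            # HugePages_Total
page_kb=2048          # Hugepagesize
echo $(( pages * page_kb ))   # 3145728, matching 'Hugetlb: 3145728 kB'
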
00:04:04.641 04:47:39 -- setup/common.sh@32 -- # continue
00:04:04.641 04:47:39 -- setup/common.sh@31 -- # IFS=': '
00:04:04.641 04:47:39 -- setup/common.sh@31 -- # read -r var val _
[... setup/common.sh@32: Inactive through Unaccepted each fail the HugePages_Total match and the IFS=': ' read loop continues ...]
00:04:04.642 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:04.642 04:47:39 -- setup/common.sh@33 -- # echo 1536
00:04:04.642 04:47:39 -- setup/common.sh@33 -- # return 0
00:04:04.642 04:47:39 -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv ))
00:04:04.642 04:47:39 -- setup/hugepages.sh@112 -- # get_nodes
00:04:04.642 04:47:39 -- setup/hugepages.sh@27 -- # local node
00:04:04.642 04:47:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:04.642 04:47:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:04.642 04:47:39 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:04.642 04:47:39 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:04.642 04:47:39 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:04.642 04:47:39 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
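The get_nodes trace above enumerates the NUMA nodes under /sys/devices/system/node with an extglob pattern and records a per-node hugepage count (512 for node0 and 1024 for node1 in this run). A minimal standalone sketch of that enumeration, assuming the per-node value comes from each node's HugePages_Total line; the helper layout and the awk extraction are illustrative, not the SPDK code itself:

    #!/usr/bin/env bash
    # Sketch: walk /sys/devices/system/node/node<N> and record each node's
    # HugePages_Total. Illustrative only; the real logic is get_nodes in
    # test/setup/hugepages.sh.
    shopt -s extglob                  # enables the +([0-9]) glob used below
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        # per-node meminfo lines look like: "Node 0 HugePages_Total:   512"
        nodes_sys[${node##*node}]=$(awk '/HugePages_Total/ {print $4}' "$node/meminfo")
    done
    echo "nodes: ${!nodes_sys[*]} -> ${nodes_sys[*]}"   # e.g. "nodes: 0 1 -> 512 1024"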
00:04:04.642 04:47:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:04.642 04:47:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:04.642 04:47:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:04.642 04:47:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:04.642 04:47:39 -- setup/common.sh@18 -- # local node=0
00:04:04.642 04:47:39 -- setup/common.sh@19 -- # local var val
00:04:04.642 04:47:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:04.642 04:47:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:04.642 04:47:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:04.642 04:47:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:04.642 04:47:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:04.642 04:47:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:04.642 04:47:39 -- setup/common.sh@31 -- # IFS=': '
00:04:04.642 04:47:39 -- setup/common.sh@31 -- # read -r var val _
00:04:04.643 04:47:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 25466704 kB' 'MemUsed: 7118664 kB' 'SwapCached: 32 kB' 'Active: 3637096 kB' 'Inactive: 269840 kB' 'Active(anon): 3256092 kB' 'Inactive(anon): 60 kB' 'Active(file): 381004 kB' 'Inactive(file): 269780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3499744 kB' 'Mapped: 144236 kB' 'AnonPages: 410320 kB' 'Shmem: 2848928 kB' 'KernelStack: 12536 kB' 'PageTables: 5288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 251380 kB' 'Slab: 649844 kB' 'SReclaimable: 251380 kB' 'SUnreclaim: 398464 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
[... setup/common.sh@32: MemTotal through HugePages_Free of the node0 report each fail the HugePages_Surp match and the read loop continues ...]
00:04:04.644 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:04.644 04:47:39 -- setup/common.sh@33 -- # echo 0
00:04:04.644 04:47:39 -- setup/common.sh@33 -- # return 0
00:04:04.644 04:47:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
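Each get_meminfo call traced here opens /proc/meminfo, or a node's meminfo file when a node id is supplied, strips the "Node N " prefix, and walks the fields with an IFS=': ' read until the requested key matches, echoing its value. A self-contained sketch of the same pattern, with illustrative names (the real helper is get_meminfo in test/setup/common.sh):

    #!/usr/bin/env bash
    # Sketch of the meminfo scan above: print the value of one field,
    # optionally from a per-node meminfo file. Illustrative only.
    get_meminfo_sketch() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo line var val _
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        while IFS= read -r line; do
            # per-node files prefix every line with "Node <id> "
            [[ -n $node ]] && line=${line#"Node $node "}
            IFS=': ' read -r var val _ <<<"$line"
            if [[ $var == "$get" ]]; then   # every other key just "continue"s
                echo "$val"
                return 0
            fi
        done <"$mem_f"
        return 1                            # field not present
    }
    get_meminfo_sketch HugePages_Surp 0     # prints 0 for node0 in this run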
00:04:04.644 04:47:39 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:04.644 04:47:39 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:04.644 04:47:39 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1
00:04:04.644 04:47:39 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:04.644 04:47:39 -- setup/common.sh@18 -- # local node=1
00:04:04.644 04:47:39 -- setup/common.sh@19 -- # local var val
00:04:04.644 04:47:39 -- setup/common.sh@20 -- # local mem_f mem
00:04:04.644 04:47:39 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:04.644 04:47:39 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]]
00:04:04.644 04:47:39 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo
00:04:04.644 04:47:39 -- setup/common.sh@28 -- # mapfile -t mem
00:04:04.644 04:47:39 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:04.644 04:47:39 -- setup/common.sh@31 -- # IFS=': '
00:04:04.644 04:47:39 -- setup/common.sh@31 -- # read -r var val _
00:04:04.644 04:47:39 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 27698400 kB' 'MemFree: 17787648 kB' 'MemUsed: 9910752 kB' 'SwapCached: 0 kB' 'Active: 4892632 kB' 'Inactive: 1196316 kB' 'Active(anon): 4777552 kB' 'Inactive(anon): 17696 kB' 'Active(file): 115080 kB' 'Inactive(file): 1178620 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 5976040 kB' 'Mapped: 43704 kB' 'AnonPages: 113184 kB' 'Shmem: 4682340 kB' 'KernelStack: 9240 kB' 'PageTables: 2592 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 224960 kB' 'Slab: 663972 kB' 'SReclaimable: 224960 kB' 'SUnreclaim: 439012 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
[... setup/common.sh@32: MemTotal through HugePages_Free of the node1 report each fail the HugePages_Surp match and the read loop continues ...]
00:04:04.645 04:47:39 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:04.645 04:47:39 -- setup/common.sh@33 -- # echo 0
00:04:04.645 04:47:39 -- setup/common.sh@33 -- # return 0
00:04:04.645 04:47:39 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:04.645 04:47:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:04.645 04:47:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:04.645 04:47:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:04.645 04:47:39 -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:04.645 node0=512 expecting 512
00:04:04.645 04:47:39 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:04.645 04:47:39 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:04.645 04:47:39 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:04.645 04:47:39 -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024'
00:04:04.645 node1=1024 expecting 1024
00:04:04.645 04:47:39 -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]]
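The @130 check that just passed is simply a string comparison of the joined per-node results against the expected literal. Restated as a sketch, with the values echoed by this run:

    # The custom_alloc verification above, restated. Illustrative only.
    declare -a nodes_test=([0]=512 [1]=1024)   # counts recovered via get_meminfo
    [[ "${nodes_test[0]},${nodes_test[1]}" == 512,1024 ]] &&
        echo 'custom allocation verified'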
00:04:04.645
00:04:04.645 real 0m3.668s
00:04:04.645 user 0m1.392s
00:04:04.645 sys 0m2.352s
00:04:04.645 04:47:39 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:04.645 04:47:39 -- common/autotest_common.sh@10 -- # set +x
00:04:04.645 ************************************
00:04:04.645 END TEST custom_alloc
00:04:04.645 ************************************
00:04:04.645 04:47:39 -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc
00:04:04.645 04:47:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:04.645 04:47:39 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:04.645 04:47:39 -- common/autotest_common.sh@10 -- # set +x
00:04:04.645 ************************************
00:04:04.645 START TEST no_shrink_alloc
00:04:04.645 ************************************
00:04:04.645 04:47:39 -- common/autotest_common.sh@1114 -- # no_shrink_alloc
00:04:04.645 04:47:39 -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0
00:04:04.645 04:47:39 -- setup/hugepages.sh@49 -- # local size=2097152
00:04:04.645 04:47:39 -- setup/hugepages.sh@50 -- # (( 2 > 1 ))
00:04:04.645 04:47:39 -- setup/hugepages.sh@51 -- # shift
00:04:04.645 04:47:39 -- setup/hugepages.sh@52 -- # node_ids=('0')
00:04:04.645 04:47:39 -- setup/hugepages.sh@52 -- # local node_ids
00:04:04.645 04:47:39 -- setup/hugepages.sh@55 -- # (( size >= default_hugepages ))
00:04:04.645 04:47:39 -- setup/hugepages.sh@57 -- # nr_hugepages=1024
00:04:04.645 04:47:39 -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0
00:04:04.645 04:47:39 -- setup/hugepages.sh@62 -- # user_nodes=('0')
00:04:04.645 04:47:39 -- setup/hugepages.sh@62 -- # local user_nodes
00:04:04.645 04:47:39 -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024
00:04:04.645 04:47:39 -- setup/hugepages.sh@65 -- # local _no_nodes=2
00:04:04.645 04:47:39 -- setup/hugepages.sh@67 -- # nodes_test=()
00:04:04.645 04:47:39 -- setup/hugepages.sh@67 -- # local -g nodes_test
00:04:04.645 04:47:39 -- setup/hugepages.sh@69 -- # (( 1 > 0 ))
00:04:04.645 04:47:39 -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}"
00:04:04.645 04:47:39 -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024
00:04:04.645 04:47:39 -- setup/hugepages.sh@73 -- # return 0
00:04:04.645 04:47:39 -- setup/hugepages.sh@198 -- # setup output
00:04:04.645 04:47:39 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:04.646 04:47:39 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:07.937 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:07.937 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:07.937 04:47:42 -- setup/hugepages.sh@199 -- # verify_nr_hugepages
00:04:07.937 04:47:42 -- setup/hugepages.sh@89 -- # local node
00:04:07.937 04:47:42 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:07.937 04:47:42 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:07.937 04:47:42 -- setup/hugepages.sh@92 -- # local surp
00:04:07.937 04:47:42 -- setup/hugepages.sh@93 -- # local resv
00:04:07.937 04:47:42 -- setup/hugepages.sh@94 -- # local anon
00:04:07.937 04:47:42 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
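Two steps are traced above: get_test_nr_hugepages converts the requested reservation size into a page count, and verify_nr_hugepages begins by confirming that transparent hugepages are not set to "never" (the @96 test checks the bracketed selection in the kernel's transparent_hugepage/enabled string). The size arithmetic, restated as a sketch under the assumption that both values are in kB, matching the 2048 kB Hugepagesize reported in the meminfo dumps below:

    # 2097152 kB requested / 2048 kB per hugepage = 1024 hugepages,
    # all assigned to the single user-supplied node. Illustrative only.
    size=2097152              # kB, as passed to get_test_nr_hugepages
    default_hugepages=2048    # kB per hugepage (Hugepagesize in /proc/meminfo)
    nr_hugepages=$(( size / default_hugepages ))
    declare -a nodes_test=()
    nodes_test[0]=$nr_hugepages
    echo "nr_hugepages=$nr_hugepages on node0"   # -> nr_hugepages=1024 on node0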
00:04:07.937 04:47:42 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:07.937 04:47:42 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:07.937 04:47:42 -- setup/common.sh@18 -- # local node=
00:04:07.937 04:47:42 -- setup/common.sh@19 -- # local var val
00:04:07.937 04:47:42 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.937 04:47:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.937 04:47:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.937 04:47:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.937 04:47:42 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.937 04:47:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.937 04:47:42 -- setup/common.sh@31 -- # IFS=': '
00:04:07.937 04:47:42 -- setup/common.sh@31 -- # read -r var val _
00:04:07.937 04:47:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44283924 kB' 'MemAvailable: 46032536 kB' 'Buffers: 7364 kB' 'Cached: 9468468 kB' 'SwapCached: 32 kB' 'Active: 8529588 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033504 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523152 kB' 'Mapped: 188004 kB' 'Shmem: 7531348 kB' 'KReclaimable: 476340 kB' 'Slab: 1313868 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837528 kB' 'KernelStack: 21824 kB' 'PageTables: 8036 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9216576 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214384 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
[... setup/common.sh@32: MemTotal through HardwareCorrupted each fail the AnonHugePages match and the read loop continues ...]
00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:07.939 04:47:42 -- setup/common.sh@33 -- # echo 0
00:04:07.939 04:47:42 -- setup/common.sh@33 -- # return 0
00:04:07.939 04:47:42 -- setup/hugepages.sh@97 -- # anon=0
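verify_nr_hugepages goes on to pull HugePages_Surp and HugePages_Rsvd from the same system-wide meminfo, and the earlier @110 assertion shows the identity being enforced: HugePages_Total must equal nr_hugepages + surp + resv. A sketch of the same bookkeeping with awk; illustrative, not the SPDK helper:

    # The accounting identity the test enforces. Illustrative only.
    total=$(awk '/^HugePages_Total:/ {print $2}' /proc/meminfo)
    surp=$(awk '/^HugePages_Surp:/ {print $2}' /proc/meminfo)
    resv=$(awk '/^HugePages_Rsvd:/ {print $2}' /proc/meminfo)
    nr=$(cat /proc/sys/vm/nr_hugepages)
    (( total == nr + surp + resv )) && echo 'hugepage accounting consistent'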
mem 00:04:07.939 04:47:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.939 04:47:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44282916 kB' 'MemAvailable: 46031528 kB' 'Buffers: 7364 kB' 'Cached: 9468472 kB' 'SwapCached: 32 kB' 'Active: 8529216 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033132 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 522784 kB' 'Mapped: 187944 kB' 'Shmem: 7531352 kB' 'KReclaimable: 476340 kB' 'Slab: 1313964 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837624 kB' 'KernelStack: 21808 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9216588 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214368 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # continue 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # continue 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # continue 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # continue 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # continue 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # continue 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # continue 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # IFS=': ' 00:04:07.939 04:47:42 -- setup/common.sh@31 -- # read -r var val _ 00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [[ 
00:04:07.939 04:47:42 -- setup/common.sh@32 -- # [... xtrace elided: the same test / continue / IFS=': ' / read -r var val _ quartet repeats for every remaining meminfo key (Inactive through HugePages_Rsvd), each compared against \H\u\g\e\P\a\g\e\s\_\S\u\r\p ...]
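A note on the backslash-heavy tests above: under set -x, bash re-quotes the right-hand side of a [[ == ]] comparison by escaping every character (a quoted RHS is matched literally, not as a glob), which is why HugePages_Surp renders as \H\u\g\e\P\a\g\e\s\_\S\u\r\p. A minimal sketch that reproduces the effect; the script name and PS4 string are illustrative approximations, not SPDK's exact ones:

    #!/usr/bin/env bash
    # xtrace_demo.sh (hypothetical name): reproduce the escaped-key trace.
    # The log's "file@line -- #" prefix comes from a custom PS4; this is an
    # assumed approximation of its shape.
    PS4='-- ${BASH_SOURCE##*/}@${LINENO} -- # '
    set -x
    key=Inactive
    # quoted RHS => literal match, so xtrace prints it escaped:
    #   [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
    [[ $key == "HugePages_Surp" ]] || echo "no match, scan continues"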
00:04:07.940 04:47:42 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.940 04:47:42 -- setup/common.sh@33 -- # echo 0
00:04:07.940 04:47:42 -- setup/common.sh@33 -- # return 0
00:04:07.940 04:47:42 -- setup/hugepages.sh@99 -- # surp=0
00:04:07.940 04:47:42 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:07.940 04:47:42 -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:07.940 04:47:42 -- setup/common.sh@18 -- # local node=
00:04:07.940 04:47:42 -- setup/common.sh@19 -- # local var val
00:04:07.940 04:47:42 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.940 04:47:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.940 04:47:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.940 04:47:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.940 04:47:42 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.940 04:47:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.940 04:47:42 -- setup/common.sh@31 -- # IFS=': '
00:04:07.940 04:47:42 -- setup/common.sh@31 -- # read -r var val _
00:04:07.940 04:47:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44283396 kB' 'MemAvailable: 46032008 kB' 'Buffers: 7364 kB' 'Cached: 9468484 kB' 'SwapCached: 32 kB' 'Active: 8529600 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033516 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523188 kB' 'Mapped: 188448 kB' 'Shmem: 7531364 kB' 'KReclaimable: 476340 kB' 'Slab: 1313964 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837624 kB' 'KernelStack: 21792 kB' 'PageTables: 7928 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9217692 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214352 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
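To orient readers in the wall of trace: every get_meminfo call follows one pattern. Slurp the (possibly per-node) meminfo file into an array, strip any 'Node <n> ' prefix, then scan key/value pairs until the requested key matches and echo its value. A self-contained sketch of that pattern, assuming the standard /proc and sysfs layouts; the function name is illustrative, not the real setup/common.sh helper:

    #!/usr/bin/env bash
    shopt -s extglob  # needed for the +([0-9]) prefix strip below
    # get_meminfo_sketch KEY [NODE]: illustrative re-creation of the lookup
    # pattern the trace shows; not the real setup/common.sh get_meminfo.
    get_meminfo_sketch() {
        local get=$1 node=${2:-} mem_f=/proc/meminfo
        local mem line var val _
        # with a node index, read the per-node view from sysfs instead
        [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
            mem_f=/sys/devices/system/node/node$node/meminfo
        mapfile -t mem < "$mem_f"
        # per-node files prefix every line with "Node <n> "; strip it
        mem=("${mem[@]#Node +([0-9]) }")
        for line in "${mem[@]}"; do
            IFS=': ' read -r var val _ <<< "$line"
            [[ $var == "$get" ]] && { echo "$val"; return 0; }
        done
        return 1
    }
    get_meminfo_sketch HugePages_Total    # -> 1024 on this box
    get_meminfo_sketch HugePages_Surp 0   # per-node lookup -> 0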
00:04:07.941 04:47:42 -- setup/common.sh@32 -- # [... per-key xtrace repeats against \H\u\g\e\P\a\g\e\s\_\R\s\v\d for every key from MemTotal through HugePages_Free ...]
00:04:07.942 04:47:42 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:07.942 04:47:42 -- setup/common.sh@33 -- # echo 0
00:04:07.942 04:47:42 -- setup/common.sh@33 -- # return 0
00:04:07.942 04:47:42 -- setup/hugepages.sh@100 -- # resv=0
00:04:07.942 04:47:42 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:07.942 nr_hugepages=1024
00:04:07.942 04:47:42 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:07.942 resv_hugepages=0
00:04:07.942 04:47:42 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:07.942 surplus_hugepages=0
00:04:07.942 04:47:42 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:07.942 anon_hugepages=0
00:04:07.942 04:47:42 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:07.942 04:47:42 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
00:04:07.942 04:47:42 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:04:07.942 04:47:42 -- setup/common.sh@17 -- # local get=HugePages_Total
00:04:07.942 04:47:42 -- setup/common.sh@18 -- # local node=
00:04:07.942 04:47:42 -- setup/common.sh@19 -- # local var val
00:04:07.942 04:47:42 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.942 04:47:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.942 04:47:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:07.942 04:47:42 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:07.942 04:47:42 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.942 04:47:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.942 04:47:42 -- setup/common.sh@31 -- # IFS=': '
00:04:07.942 04:47:42 -- setup/common.sh@31 -- # read -r var val _
00:04:07.942 04:47:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44279616 kB' 'MemAvailable: 46028228 kB' 'Buffers: 7364 kB' 'Cached: 9468484 kB' 'SwapCached: 32 kB' 'Active: 8532280 kB' 'Inactive: 1466156 kB' 'Active(anon): 8036196 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 525880 kB' 'Mapped: 188448 kB' 'Shmem: 7531364 kB' 'KReclaimable: 476340 kB' 'Slab: 1313964 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 837624 kB' 'KernelStack: 21824 kB' 'PageTables: 8024 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9220348 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
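The three lookups above reduce to simple pool accounting: the run wants exactly 1024 hugepages, none of them surplus or reserved. Spelled out in plain shell, reusing the illustrative helper from the earlier sketch:

    # fragment: assumes get_meminfo_sketch from the sketch above is defined
    target=1024                                  # pool size this run expects
    surp=$(get_meminfo_sketch HugePages_Surp)    # 0 in the trace
    resv=$(get_meminfo_sketch HugePages_Rsvd)    # 0 in the trace
    total=$(get_meminfo_sketch HugePages_Total)  # 1024 in the trace
    # proceed only when the kernel pool is exactly the requested size and
    # nothing is parked in surplus or reserved state
    (( total == target + surp + resv )) || { echo "hugepage pool mismatch" >&2; exit 1; }
    echo "nr_hugepages=$total resv_hugepages=$resv surplus_hugepages=$surp"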
00:04:07.942 04:47:42 -- setup/common.sh@32 -- # [... per-key xtrace repeats against \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l for every key from MemTotal through Unaccepted ...]
00:04:07.943 04:47:42 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:07.943 04:47:42 -- setup/common.sh@33 -- # echo 1024
00:04:07.943 04:47:42 -- setup/common.sh@33 -- # return 0
00:04:07.943 04:47:42 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:07.943 04:47:42 -- setup/hugepages.sh@112 -- # get_nodes
00:04:07.943 04:47:42 -- setup/hugepages.sh@27 -- # local node
00:04:07.943 04:47:42 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.943 04:47:42 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:07.943 04:47:42 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:07.943 04:47:42 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:07.943 04:47:42 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:07.943 04:47:42 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:07.943 04:47:42 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:07.943 04:47:42 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:07.943 04:47:42 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:07.943 04:47:42 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:07.943 04:47:42 -- setup/common.sh@18 -- # local node=0
00:04:07.943 04:47:42 -- setup/common.sh@19 -- # local var val
00:04:07.944 04:47:42 -- setup/common.sh@20 -- # local mem_f mem
00:04:07.944 04:47:42 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:07.944 04:47:42 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:07.944 04:47:42 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:07.944 04:47:42 -- setup/common.sh@28 -- # mapfile -t mem
00:04:07.944 04:47:42 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:07.944 04:47:42 -- setup/common.sh@31 -- # IFS=': '
00:04:07.944 04:47:42 -- setup/common.sh@31 -- # read -r var val _
00:04:07.944 04:47:42 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 24402736 kB' 'MemUsed: 8182632 kB' 'SwapCached: 32 kB' 'Active: 3636356 kB' 'Inactive: 269840 kB' 'Active(anon): 3255352 kB' 'Inactive(anon): 60 kB' 'Active(file): 381004 kB' 'Inactive(file): 269780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3499788 kB' 'Mapped: 145028 kB' 'AnonPages: 409536 kB' 'Shmem: 2848972 kB' 'KernelStack: 12552 kB' 'PageTables: 5336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 251380 kB' 'Slab: 650000 kB' 'SReclaimable: 251380 kB' 'SUnreclaim: 398620 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
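The per-node call above is the same scan pointed at /sys/devices/system/node/node0/meminfo, whose lines carry a 'Node <n> ' prefix (hence the mem=("${mem[@]#Node +([0-9]) }") strip). Reading the per-node pool directly gives the numbers behind the node0=1024 line further down; the paths are the standard sysfs layout, the loop itself is a sketch with a hypothetical script name:

    #!/usr/bin/env bash
    # node_pages.sh (hypothetical name): per-node hugepage totals.
    shopt -s extglob nullglob
    for node in /sys/devices/system/node/node+([0-9]); do
        n=${node##*node}
        # per-node meminfo lines read "Node <n> HugePages_Total: <count>"
        while read -r _ _ key val _; do
            [[ $key == HugePages_Total: ]] && echo "node$n=$val"
        done < "$node/meminfo"
    done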
00:04:07.944 04:47:42 -- setup/common.sh@32 -- # [... per-key xtrace repeats against \H\u\g\e\P\a\g\e\s\_\S\u\r\p for every node0 meminfo key from MemTotal through HugePages_Free ...]
00:04:07.945 04:47:42 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:07.945 04:47:42 -- setup/common.sh@33 -- # echo 0
00:04:07.945 04:47:42 -- setup/common.sh@33 -- # return 0
00:04:07.945 04:47:42 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:04:07.945 04:47:42 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:07.945 04:47:42 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:07.945 04:47:42 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:07.945 04:47:42 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
00:04:07.945 node0=1024 expecting 1024
00:04:07.945 04:47:42 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:07.945 04:47:42 -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:04:07.945 04:47:42 -- setup/hugepages.sh@202 -- # NRHUGE=512
00:04:07.945 04:47:42 -- setup/hugepages.sh@202 -- # setup output
00:04:07.945 04:47:42 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:07.945 04:47:42 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh
00:04:11.234 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:11.234 0000:d8:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:11.234 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:04:11.234 04:47:46 -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:04:11.234 04:47:46 -- setup/hugepages.sh@89 -- # local node
00:04:11.234 04:47:46 -- setup/hugepages.sh@90 -- # local sorted_t
00:04:11.234 04:47:46 -- setup/hugepages.sh@91 -- # local sorted_s
00:04:11.234 04:47:46 -- setup/hugepages.sh@92 -- # local surp
00:04:11.234 04:47:46 -- setup/hugepages.sh@93 -- # local resv
00:04:11.234 04:47:46 -- setup/hugepages.sh@94 -- # local anon
00:04:11.234 04:47:46 -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:11.234 04:47:46 -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:11.234 04:47:46 -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:11.234 04:47:46 -- setup/common.sh@18 -- # local node=
00:04:11.234 04:47:46 -- setup/common.sh@19 -- # local var val
00:04:11.234 04:47:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.234 04:47:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.234 04:47:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:11.234 04:47:46 -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:11.234 04:47:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.234 04:47:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.234 04:47:46 -- setup/common.sh@31 -- # IFS=': '
00:04:11.234 04:47:46 -- setup/common.sh@31 -- # read -r var val _
00:04:11.234 04:47:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44275664 kB' 'MemAvailable: 46024276 kB' 'Buffers: 7364 kB' 'Cached: 9468588 kB' 'SwapCached: 32 kB' 'Active: 8529916 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033832 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523440 kB' 'Mapped: 188036 kB' 'Shmem: 7531468 kB' 'KReclaimable: 476340 kB' 'Slab: 1314688 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 838348 kB' 'KernelStack: 21824 kB' 'PageTables: 8068 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9217224 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214336 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB'
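The INFO line is setup.sh declining to shrink the pool: with CLEAR_HUGE=no and NRHUGE=512, 512 pages were requested but node0 already holds 1024, so nothing changes. A sketch of that top-up-only behaviour as read from the output, not the real scripts/setup.sh, which does considerably more:

    # sketch of the top-up-only logic implied by the INFO line; the sysfs
    # path is the standard per-node 2 MiB pool. Writing nr_hugepages needs root.
    NRHUGE=${NRHUGE:-512}
    pool=/sys/devices/system/node/node0/hugepages/hugepages-2048kB
    cur=$(< "$pool/nr_hugepages")
    if (( cur >= NRHUGE )); then
        # pool already at least as large as requested: leave it alone
        echo "INFO: Requested $NRHUGE hugepages but $cur already allocated on node0"
    else
        echo "$NRHUGE" > "$pool/nr_hugepages"
    fi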
00:04:11.234 04:47:46 -- setup/common.sh@32 -- # [... per-key xtrace repeats against \A\n\o\n\H\u\g\e\P\a\g\e\s, MemTotal onward; the capture breaks off mid-scan ...]
04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:11.235 04:47:46 -- setup/common.sh@33 -- # echo 0 00:04:11.235 04:47:46 -- setup/common.sh@33 -- # return 0 00:04:11.235 04:47:46 -- setup/hugepages.sh@97 -- # anon=0 00:04:11.235 04:47:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:11.235 04:47:46 -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:11.235 04:47:46 -- setup/common.sh@18 -- # local node= 00:04:11.235 04:47:46 -- setup/common.sh@19 -- # local var val 00:04:11.235 04:47:46 -- setup/common.sh@20 -- # local mem_f mem 00:04:11.235 04:47:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:11.235 04:47:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:11.235 04:47:46 -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:11.235 04:47:46 -- setup/common.sh@28 -- # mapfile -t mem 00:04:11.235 04:47:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60283768 kB' 'MemFree: 44277280 kB' 'MemAvailable: 46025892 kB' 'Buffers: 7364 kB' 'Cached: 9468592 kB' 'SwapCached: 32 kB' 'Active: 8529808 kB' 'Inactive: 1466156 kB' 'Active(anon): 8033724 kB' 'Inactive(anon): 17756 kB' 'Active(file): 496084 kB' 'Inactive(file): 1448400 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8386044 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 523320 kB' 'Mapped: 187948 kB' 'Shmem: 7531472 kB' 'KReclaimable: 476340 kB' 'Slab: 1314636 kB' 'SReclaimable: 476340 kB' 'SUnreclaim: 838296 kB' 'KernelStack: 21808 kB' 'PageTables: 7976 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 37481912 kB' 'Committed_AS: 9217236 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 214304 kB' 'VmallocChunk: 0 kB' 'Percpu: 94080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 576884 kB' 'DirectMap2M: 7497728 kB' 'DirectMap1G: 61865984 kB' 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.235 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.235 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 
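For readability, this is the lookup pattern the xtrace above corresponds to: get_meminfo snapshots the meminfo file and scans it key by key until the requested field matches. A minimal standalone sketch, with assumptions flagged: the function name get_meminfo_sketch and the direct read from the file are illustrative only (the real setup/common.sh buffers the file with mapfile and strips node prefixes first, as the trace shows).

    # minimal sketch (hypothetical helper, not the SPDK function itself):
    # scan "<key>: <value> [kB]" records; IFS=': ' splits key from value
    get_meminfo_sketch() {
        local get=$1 var val _
        while IFS=': ' read -r var val _; do
            if [[ $var == "$get" ]]; then
                echo "$val"   # on this run, AnonHugePages -> 0
                return 0
            fi
        done < /proc/meminfo
        return 1
    }
    anon=$(get_meminfo_sketch AnonHugePages)

Called that way, the trace's "echo 0" followed by "anon=0" is exactly this hit: the loop reaches the AnonHugePages record and returns its value.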
00:04:11.235 04:47:46 -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
[xtrace condensed: same common.sh@17-@31 prologue as above, now with get=HugePages_Surp and mem_f=/proc/meminfo]
00:04:11.235 04:47:46 -- setup/common.sh@16 -- # printf '%s\n' [snapshot repeated; only transient fields drift slightly (MemFree: 44277280 kB, AnonPages: 523320 kB, PageTables: 7976 kB); every HugePages_* counter is unchanged]
[xtrace condensed: the read loop 'continue's past every key until HugePages_Surp]
00:04:11.236 04:47:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:11.236 04:47:46 -- setup/common.sh@33 -- # echo 0
00:04:11.236 04:47:46 -- setup/common.sh@33 -- # return 0
00:04:11.236 04:47:46 -- setup/hugepages.sh@99 -- # surp=0
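The HugePages_Surp lookup here, and the HugePages_Rsvd lookup that follows, are the same scan with a different key. Outside the harness, equivalent one-off queries can be done with awk (a sketch of an equivalent, not what setup/common.sh actually runs):

    awk '$1 == "HugePages_Surp:" {print $2}' /proc/meminfo   # -> 0 on this run
    awk '$1 == "HugePages_Rsvd:" {print $2}' /proc/meminfo   # -> 0 on this run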
00:04:11.236 04:47:46 -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
[xtrace condensed: same common.sh@17-@31 prologue, now with get=HugePages_Rsvd and mem_f=/proc/meminfo]
00:04:11.237 04:47:46 -- setup/common.sh@16 -- # printf '%s\n' [snapshot repeated; MemFree: 44277688 kB; every HugePages_* counter is unchanged]
[xtrace condensed: the read loop 'continue's past every key until HugePages_Rsvd]
00:04:11.238 04:47:46 -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:11.238 04:47:46 -- setup/common.sh@33 -- # echo 0
00:04:11.238 04:47:46 -- setup/common.sh@33 -- # return 0
00:04:11.238 04:47:46 -- setup/hugepages.sh@100 -- # resv=0
00:04:11.238 04:47:46 -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024
00:04:11.238 nr_hugepages=1024
00:04:11.238 04:47:46 -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:04:11.238 resv_hugepages=0
00:04:11.238 04:47:46 -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:04:11.238 surplus_hugepages=0
00:04:11.238 04:47:46 -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:04:11.238 anon_hugepages=0
00:04:11.238 04:47:46 -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:11.238 04:47:46 -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages ))
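The @107/@109 guards just traced are plain arithmetic over the three lookups: the kernel's global pool must equal the expected count plus surplus and reserved pages. Restated with this run's expanded values (a sketch mirroring the script's variable names, not the script itself; 1024 is the count the run settled on, since more pages were already allocated than the 512 requested):

    nr_hugepages=1024 surp=0 resv=0
    (( 1024 == nr_hugepages + surp + resv ))   # @107: 1024 == 1024+0+0 -> true
    (( 1024 == nr_hugepages ))                 # @109: true, so the per-node split is checked next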
00:04:11.238 04:47:46 -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
[xtrace condensed: same common.sh@17-@31 prologue, now with get=HugePages_Total and mem_f=/proc/meminfo]
00:04:11.238 04:47:46 -- setup/common.sh@16 -- # printf '%s\n' [snapshot repeated; MemFree: 44277688 kB; 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' unchanged]
[xtrace condensed: the read loop 'continue's past every key until HugePages_Total]
00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:11.499 04:47:46 -- setup/common.sh@33 -- # echo 1024
00:04:11.499 04:47:46 -- setup/common.sh@33 -- # return 0
00:04:11.499 04:47:46 -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:11.499 04:47:46 -- setup/hugepages.sh@112 -- # get_nodes
00:04:11.499 04:47:46 -- setup/hugepages.sh@27 -- # local node
00:04:11.499 04:47:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:11.499 04:47:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024
00:04:11.499 04:47:46 -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:11.499 04:47:46 -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0
00:04:11.499 04:47:46 -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:11.499 04:47:46 -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:11.499 04:47:46 -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:11.499 04:47:46 -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:11.499 04:47:46 -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:11.499 04:47:46 -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:11.499 04:47:46 -- setup/common.sh@18 -- # local node=0
00:04:11.499 04:47:46 -- setup/common.sh@19 -- # local var val
00:04:11.499 04:47:46 -- setup/common.sh@20 -- # local mem_f mem
00:04:11.499 04:47:46 -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:11.499 04:47:46 -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:11.499 04:47:46 -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:11.499 04:47:46 -- setup/common.sh@28 -- # mapfile -t mem
00:04:11.499 04:47:46 -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': '
00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _
00:04:11.499 04:47:46 -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32585368 kB' 'MemFree: 24396980 kB' 'MemUsed: 8188388 kB' 'SwapCached: 32 kB' 'Active: 3636476 kB' 'Inactive: 269840 kB' 'Active(anon): 3255472 kB' 'Inactive(anon): 60 kB' 'Active(file): 381004 kB' 'Inactive(file): 269780 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3499800 kB' 'Mapped: 144244 kB' 'AnonPages: 409736 kB' 'Shmem: 2848984 kB' 'KernelStack: 12584 kB' 'PageTables: 5480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 251380 kB' 'Slab: 650488 kB' 'SReclaimable: 251380 kB' 'SUnreclaim: 399108 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0'
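The get_nodes pass traced above enumerates the NUMA nodes under /sys and records how the allocated pages are spread; on this box that yields two nodes with all 1024 pages on node0. A standalone sketch of the same bookkeeping (extglob is assumed on, as in the sourced scripts; the hard-coded counts mirror this run rather than a live read of the per-node hugepages files):

    shopt -s extglob
    declare -a nodes_sys
    for node in /sys/devices/system/node/node+([0-9]); do
        nodes_sys[${node##*node}]=0   # index by the trailing node number
    done
    nodes_sys[0]=1024                 # this run: every page sits on node0
    echo "no_nodes=${#nodes_sys[@]}"  # -> no_nodes=2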
continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ 
Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.499 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.499 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- 
setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # continue 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # IFS=': ' 00:04:11.500 04:47:46 -- setup/common.sh@31 -- # read -r var val _ 00:04:11.500 04:47:46 -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:11.500 04:47:46 -- setup/common.sh@33 -- # echo 0 00:04:11.500 04:47:46 -- setup/common.sh@33 -- # return 0 00:04:11.500 04:47:46 -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:11.500 04:47:46 -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:11.500 04:47:46 -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:11.500 04:47:46 -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:11.500 04:47:46 -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:11.500 node0=1024 expecting 1024 
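[editor's note] The get_meminfo calls traced above are just a field lookup over /proc/meminfo, or over a node's sysfs copy when a node index is given. A minimal standalone sketch of that pattern, reconstructed from the xtrace rather than copied from setup/common.sh:

#!/usr/bin/env bash
# Sketch of the get_meminfo pattern seen in the trace (assumed from the
# xtrace, not the verbatim setup/common.sh source): fetch one "key: value"
# field from /proc/meminfo, or from a per-node meminfo file.
get_meminfo() {
  local get=$1 node=${2:-}
  local mem_f=/proc/meminfo
  if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
    mem_f=/sys/devices/system/node/node$node/meminfo
  fi
  local line var val _
  while read -r line; do
    # Per-node files prefix every record with "Node N "; strip it first.
    if [[ $line =~ ^Node\ [0-9]+\ (.*)$ ]]; then line=${BASH_REMATCH[1]}; fi
    IFS=': ' read -r var val _ <<<"$line"
    if [[ $var == "$get" ]]; then
      echo "$val"        # e.g. "1024" for HugePages_Total above
      return 0
    fi
  done <"$mem_f"
  return 1
}

get_meminfo HugePages_Total     # system-wide count
get_meminfo HugePages_Surp 0    # node 0 only

The "Node N " prefix handling is the reason the traced script rewrites the mapfile array before its read loop: the per-node sysfs file and /proc/meminfo share keys but not line layout.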
00:04:11.500 04:47:46 -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:04:11.500 00:04:11.500 real 0m6.753s
00:04:11.500 user 0m2.396s
00:04:11.500 sys 0m4.359s
00:04:11.500 04:47:46 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:11.500 04:47:46 -- common/autotest_common.sh@10 -- # set +x
00:04:11.500 ************************************
00:04:11.500 END TEST no_shrink_alloc
00:04:11.500 ************************************
00:04:11.500 04:47:46 -- setup/hugepages.sh@217 -- # clear_hp
00:04:11.500 04:47:46 -- setup/hugepages.sh@37 -- # local node hp
00:04:11.500 04:47:46 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:11.500 04:47:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:11.500 04:47:46 -- setup/hugepages.sh@41 -- # echo 0
00:04:11.500 04:47:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:11.500 04:47:46 -- setup/hugepages.sh@41 -- # echo 0
00:04:11.500 04:47:46 -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}"
00:04:11.500 04:47:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:11.500 04:47:46 -- setup/hugepages.sh@41 -- # echo 0
00:04:11.500 04:47:46 -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"*
00:04:11.500 04:47:46 -- setup/hugepages.sh@41 -- # echo 0
00:04:11.500 04:47:46 -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes
00:04:11.500 04:47:46 -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes
00:04:11.500 00:04:11.500 real 0m26.643s
00:04:11.500 user 0m9.356s
00:04:11.500 sys 0m16.009s
00:04:11.500 04:47:46 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:11.500 04:47:46 -- common/autotest_common.sh@10 -- # set +x
00:04:11.500 ************************************
00:04:11.500 END TEST hugepages
00:04:11.500 ************************************
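[editor's note] The clear_hp teardown above walks both NUMA nodes and zeroes every hugepage pool. A minimal sketch of the same walk; the xtrace only shows "echo 0", so writing to each pool's nr_hugepages knob is an assumption about the redirection target:

#!/usr/bin/env bash
# Sketch of the clear_hp step traced above (run as root).
for node in /sys/devices/system/node/node[0-9]*; do
  for hp in "$node"/hugepages/hugepages-*; do
    echo 0 > "$hp/nr_hugepages"   # release every reserved page of this size
  done
done
export CLEAR_HUGE=yes             # consumed by scripts/setup.sh on later resets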
00:04:11.500 04:47:46 -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:11.500 04:47:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:11.500 04:47:46 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:11.500 04:47:46 -- common/autotest_common.sh@10 -- # set +x
00:04:11.500 ************************************
00:04:11.500 START TEST driver
00:04:11.500 ************************************
00:04:11.500 04:47:46 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/driver.sh
00:04:11.500 * Looking for test storage...
00:04:11.500 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:11.759 [xtrace condensed: with coverage enabled, common/autotest_common.sh checked the installed lcov ("lcov --version" reports 1.15, compared against 2 via scripts/common.sh cmp_versions, so 1.15 < 2) and exported LCOV_OPTS and LCOV with --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1, the genhtml/geninfo block flags, and --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh]
00:04:11.759 04:47:46 -- setup/driver.sh@68 -- # setup reset
00:04:11.759 04:47:46 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:11.759 04:47:46 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:17.029 04:47:51 -- setup/driver.sh@69 -- # run_test guess_driver guess_driver
00:04:17.029 04:47:51 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:17.029 04:47:51 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:17.029 04:47:51 -- common/autotest_common.sh@10 -- # set +x
00:04:17.029 ************************************
00:04:17.029 START TEST guess_driver
00:04:17.029 ************************************
00:04:17.029 04:47:51 -- common/autotest_common.sh@1114 -- # guess_driver
00:04:17.029 04:47:51 -- setup/driver.sh@46 -- # local driver setup_driver marker
00:04:17.029 04:47:51 -- setup/driver.sh@47 -- # local fail=0
00:04:17.029 04:47:51 -- setup/driver.sh@49 -- # pick_driver
00:04:17.029 04:47:51 -- setup/driver.sh@36 -- # vfio
00:04:17.029 04:47:51 -- setup/driver.sh@21 -- # local iommu_grups
00:04:17.029 04:47:51 -- setup/driver.sh@22 -- # local unsafe_vfio
00:04:17.029 04:47:51 -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]]
00:04:17.029 04:47:51 -- setup/driver.sh@25 -- # unsafe_vfio=N
00:04:17.029 04:47:51 -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*)
00:04:17.029 04:47:51 -- setup/driver.sh@29 -- # (( 176 > 0 ))
00:04:17.029 04:47:51 -- setup/driver.sh@30 -- # is_driver vfio_pci
00:04:17.029 04:47:51 -- setup/driver.sh@14 -- # mod vfio_pci
00:04:17.029 04:47:51 -- setup/driver.sh@12 -- # dep vfio_pci
00:04:17.029 04:47:51 -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci
00:04:17.029 04:47:51 -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/virt/lib/irqbypass.ko.xz insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio.ko.xz insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz insmod /lib/modules/6.8.9-200.fc39.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]]
00:04:17.029 04:47:51 -- setup/driver.sh@30 -- # return 0
00:04:17.029 04:47:51 -- setup/driver.sh@37 -- # echo vfio-pci
00:04:17.029 04:47:51 -- setup/driver.sh@49 -- # driver=vfio-pci
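[editor's note] The pick_driver decision just traced reduces to two checks: are any IOMMU groups populated, and does the vfio_pci module chain resolve. A sketch reconstructed from the xtrace (the fallback message is taken from the "No valid driver found" guard visible below; the rest of driver.sh is assumed):

#!/usr/bin/env bash
# Sketch of the vfio-pci detection seen in the trace above.
shopt -s nullglob                       # so an empty glob counts as zero groups
iommu_groups=(/sys/kernel/iommu_groups/*)
if (( ${#iommu_groups[@]} > 0 )) && modprobe --show-depends vfio_pci | grep -q '\.ko'; then
  driver=vfio-pci                       # an IOMMU is active and the modules resolve
else
  driver='No valid driver found'
fi
echo "Looking for driver=$driver"

On this host the trace shows 176 IOMMU groups and a complete irqbypass/iommufd/vfio insmod chain, so the vfio-pci branch wins.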
00:04:17.029 04:47:51 -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]]
00:04:17.029 04:47:51 -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci'
Looking for driver=vfio-pci
00:04:17.029 04:47:51 -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver
00:04:17.029 04:47:51 -- setup/driver.sh@45 -- # setup output config
00:04:17.029 04:47:51 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:17.029 04:47:51 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:20.316 [xtrace condensed: the @57/@58/@61 marker loop then read one "-> ... vfio-pci" record per device from the setup.sh config output; every "->" marker matched and every setup_driver field was vfio-pci, so fail stayed 0]
00:04:21.693 04:47:56 -- setup/driver.sh@64 -- # (( fail == 0 ))
00:04:21.693 04:47:56 -- setup/driver.sh@65 -- # setup reset
00:04:21.693 04:47:56 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:21.693 04:47:56 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:26.960 00:04:26.960 real 0m9.803s
00:04:26.960 user 0m2.427s
00:04:26.960 sys 0m5.090s
00:04:26.960 04:48:01 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:26.960 04:48:01 -- common/autotest_common.sh@10 -- # set +x
00:04:26.960 ************************************
00:04:26.960 END TEST guess_driver
00:04:26.960 ************************************
00:04:26.960 00:04:26.960 real 0m14.854s
00:04:26.960 user 0m3.954s
00:04:26.960 sys 0m7.865s
00:04:26.960 04:48:01 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:26.960 04:48:01 -- common/autotest_common.sh@10 -- # set +x
00:04:26.960 ************************************
00:04:26.960 END TEST driver
00:04:26.960 ************************************
00:04:26.960 04:48:01 -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:04:26.960 04:48:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:26.960 04:48:01 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:26.960 04:48:01 -- common/autotest_common.sh@10 -- # set +x
00:04:26.960 ************************************
00:04:26.960 START TEST devices
00:04:26.960 ************************************
00:04:26.960 04:48:01 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/devices.sh
00:04:26.960 * Looking for test storage...
00:04:26.961 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup
00:04:26.961 [xtrace condensed: the same lcov 1.15-vs-2 version check and LCOV_OPTS/LCOV export already shown at the start of the driver suite ran again here with identical results]
00:04:26.961 04:48:01 -- setup/devices.sh@190 -- # trap cleanup EXIT
00:04:26.961 04:48:01 -- setup/devices.sh@192 -- # setup reset
00:04:26.961 04:48:01 -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:26.961 04:48:01 -- setup/common.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset
00:04:31.204 04:48:05 -- setup/devices.sh@194 -- # get_zoned_devs
00:04:31.204 04:48:05 -- common/autotest_common.sh@1664 -- # zoned_devs=()
00:04:31.204 04:48:05 -- common/autotest_common.sh@1664 -- # local -gA zoned_devs
00:04:31.204 04:48:05 -- common/autotest_common.sh@1665 -- # local nvme bdf
00:04:31.204 04:48:05 -- common/autotest_common.sh@1667 -- # for nvme in /sys/block/nvme*
00:04:31.204 04:48:05 -- common/autotest_common.sh@1668 -- # is_block_zoned nvme0n1
00:04:31.204 04:48:05 -- common/autotest_common.sh@1657 -- # local device=nvme0n1
00:04:31.204 04:48:05 -- common/autotest_common.sh@1659 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:04:31.204 04:48:05 -- common/autotest_common.sh@1660 -- # [[ none != none ]]
00:04:31.204 04:48:05 -- setup/devices.sh@196 -- # blocks=()
00:04:31.204 04:48:05 -- setup/devices.sh@196 -- # declare -a blocks
00:04:31.204 04:48:05 -- setup/devices.sh@197 -- # blocks_to_pci=()
00:04:31.204 04:48:05 -- setup/devices.sh@197 -- # declare -A blocks_to_pci
00:04:31.204 04:48:05 -- setup/devices.sh@198 -- # min_disk_size=3221225472
00:04:31.204 04:48:05 -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*)
00:04:31.204 04:48:05 -- setup/devices.sh@201 -- # ctrl=nvme0n1
00:04:31.204 04:48:05 -- setup/devices.sh@201 -- # ctrl=nvme0
00:04:31.204 04:48:05 -- setup/devices.sh@202 -- # pci=0000:d8:00.0
00:04:31.204 04:48:05 -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\d\8\:\0\0\.\0* ]]
00:04:31.204 04:48:05 -- setup/devices.sh@204 -- # block_in_use nvme0n1
00:04:31.204 04:48:05 -- scripts/common.sh@380 -- # local block=nvme0n1 pt
00:04:31.204 04:48:05 -- scripts/common.sh@389 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1
No valid GPT data, bailing
00:04:31.204 04:48:05 -- scripts/common.sh@393 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:04:31.204 04:48:05 -- scripts/common.sh@393 -- # pt=
00:04:31.204 04:48:05 -- scripts/common.sh@394 -- # return 1
00:04:31.204 04:48:05 -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1
00:04:31.204 04:48:05 -- setup/common.sh@76 -- # local dev=nvme0n1
00:04:31.204 04:48:05 -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]]
00:04:31.204 04:48:05 -- setup/common.sh@80 -- # echo 1600321314816
00:04:31.204 04:48:05 -- setup/devices.sh@204 -- # (( 1600321314816 >= min_disk_size ))
00:04:31.204 04:48:05 -- setup/devices.sh@205 -- # blocks+=("${block##*/}")
00:04:31.204 04:48:05 -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:d8:00.0
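[editor's note] The probe just traced picks a test disk by excluding zoned namespaces, excluding disks that are already partitioned, and requiring a minimum size. A sketch of the same filter; the real block_in_use first asks scripts/spdk-gpt.py (which bailed here), so keeping only the blkid fallback is a simplification, and the plain glob below approximates the script's "/sys/block/nvme"!(*c*) extglob that skips controller nodes:

#!/usr/bin/env bash
# Sketch of the block-device selection traced above (run as root for blkid).
min_disk_size=3221225472                                # 3 GiB, as in the trace
for block in /sys/block/nvme*n[0-9]; do
  dev=${block##*/}
  [[ $(<"$block/queue/zoned") == none ]] || continue            # skip zoned namespaces
  [[ -z $(blkid -s PTTYPE -o value "/dev/$dev") ]] || continue  # partition table means in use
  bytes=$(( $(<"$block/size") * 512 ))                          # size file is in 512 B sectors
  (( bytes >= min_disk_size )) && echo "test disk candidate: $dev ($bytes bytes)"
done

For the logged host this yields nvme0n1 at 1600321314816 bytes, well over the 3 GiB floor.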
00:04:31.204 04:48:05 -- setup/devices.sh@209 -- # (( 1 > 0 ))
00:04:31.204 04:48:05 -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1
00:04:31.204 04:48:05 -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount
00:04:31.204 04:48:05 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:31.204 04:48:05 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:31.204 04:48:05 -- common/autotest_common.sh@10 -- # set +x
00:04:31.204 ************************************
00:04:31.204 START TEST nvme_mount
00:04:31.204 ************************************
00:04:31.204 04:48:05 -- common/autotest_common.sh@1114 -- # nvme_mount
00:04:31.204 04:48:05 -- setup/devices.sh@95 -- # nvme_disk=nvme0n1
00:04:31.204 04:48:05 -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1
00:04:31.204 04:48:05 -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:31.204 04:48:05 -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:31.204 04:48:05 -- setup/devices.sh@101 -- # partition_drive nvme0n1 1
00:04:31.204 04:48:05 -- setup/common.sh@39 -- # local disk=nvme0n1
00:04:31.204 04:48:05 -- setup/common.sh@40 -- # local part_no=1
00:04:31.204 04:48:05 -- setup/common.sh@41 -- # local size=1073741824
00:04:31.204 04:48:05 -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:04:31.204 04:48:05 -- setup/common.sh@44 -- # parts=()
00:04:31.204 04:48:05 -- setup/common.sh@44 -- # local parts
00:04:31.204 04:48:05 -- setup/common.sh@46 -- # (( part = 1 ))
00:04:31.204 04:48:05 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:31.204 04:48:05 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:31.204 04:48:05 -- setup/common.sh@46 -- # (( part++ ))
00:04:31.204 04:48:05 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:31.204 04:48:05 -- setup/common.sh@51 -- # (( size /= 512 ))
00:04:31.204 04:48:05 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:04:31.204 04:48:05 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1
00:04:31.477 Creating new GPT entries in memory.
00:04:31.477 GPT data structures destroyed! You may now partition the disk using fdisk or
00:04:31.477 other utilities.
00:04:31.477 04:48:06 -- setup/common.sh@57 -- # (( part = 1 ))
00:04:31.477 04:48:06 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:31.477 04:48:06 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:04:31.477 04:48:06 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:04:31.477 04:48:06 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:04:32.411 Creating new GPT entries in memory.
00:04:32.411 The operation has completed successfully.
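[editor's note] The nvme_mount test that begins above is a full round-trip: partition, format, mount, drop a marker file, verify, then tear down. A condensed sketch with the device and mount point taken from the log; "udevadm settle" stands in for sync_dev_uevents.sh, whose internals this log does not show:

#!/usr/bin/env bash
# Sketch of the nvme_mount round-trip traced in this suite.
set -e
disk=/dev/nvme0n1
mnt=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount

sgdisk "$disk" --zap-all                            # clear any old table
flock "$disk" sgdisk "$disk" --new=1:2048:2099199   # one 1 GiB partition
udevadm settle                                      # wait for /dev/nvme0n1p1

mkdir -p "$mnt"
mkfs.ext4 -qF "${disk}p1"
mount "${disk}p1" "$mnt"
touch "$mnt/test_nvme"                              # the file the verify step checks

# teardown, mirroring the cleanup_nvme trace below:
rm "$mnt/test_nvme"
umount "$mnt"
wipefs --all "${disk}p1"
wipefs --all "$disk"

Sectors 2048 through 2099199 are 2097152 sectors, i.e. exactly 1 GiB, matching the size=1073741824 local above.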
00:04:32.411 04:48:07 -- setup/common.sh@57 -- # (( part++ ))
00:04:32.411 04:48:07 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:32.411 04:48:07 -- setup/common.sh@62 -- # wait 3638034
00:04:32.670 04:48:07 -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:32.670 04:48:07 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=
00:04:32.670 04:48:07 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:32.670 04:48:07 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]]
00:04:32.670 04:48:07 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1
00:04:32.670 04:48:07 -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:32.670 04:48:07 -- setup/devices.sh@105 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:32.670 04:48:07 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:32.670 04:48:07 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1
00:04:32.670 04:48:07 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:32.670 04:48:07 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:32.670 04:48:07 -- setup/devices.sh@53 -- # local found=0
00:04:32.670 04:48:07 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:32.670 04:48:07 -- setup/devices.sh@56 -- # :
00:04:32.670 04:48:07 -- setup/devices.sh@59 -- # local pci status
00:04:32.670 04:48:07 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:32.670 04:48:07 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:32.670 04:48:07 -- setup/devices.sh@47 -- # setup output config
00:04:32.670 04:48:07 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:32.670 04:48:07 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:35.957 04:48:10 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:35.957 04:48:10 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]]
00:04:35.957 04:48:10 -- setup/devices.sh@63 -- # found=1
00:04:35.957 [xtrace condensed: the remaining sixteen config records, 0000:00:04.0-7 and 0000:80:04.0-7, did not match 0000:d8:00.0 and were skipped]
00:04:35.957 04:48:11 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:35.957 04:48:11 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]]
00:04:35.957 04:48:11 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:35.957 04:48:11 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:35.957 04:48:11 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:35.957 04:48:11 -- setup/devices.sh@110 -- # cleanup_nvme
00:04:35.957 04:48:11 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:35.957 04:48:11 -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:35.957 04:48:11 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:35.957 04:48:11 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:04:36.217 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:04:36.217 04:48:11 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:04:36.217 04:48:11 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:04:36.475 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54
00:04:36.475 /dev/nvme0n1: 8 bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54
00:04:36.475 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:04:36.476 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:04:36.476 04:48:11 -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 1024M
00:04:36.476 04:48:11 -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount size=1024M
00:04:36.476 04:48:11 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:36.476 04:48:11 -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]]
00:04:36.476 04:48:11 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M
00:04:36.476 04:48:11 -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:36.476 04:48:11 -- setup/devices.sh@116 -- # verify 0000:d8:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:36.476 04:48:11 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:36.476 04:48:11 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1
00:04:36.476 04:48:11 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:36.476 04:48:11 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:36.476 04:48:11 -- setup/devices.sh@53 -- # local found=0
00:04:36.476 04:48:11 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:36.476 04:48:11 -- setup/devices.sh@56 -- # :
00:04:36.476 04:48:11 -- setup/devices.sh@59 -- # local pci status
00:04:36.476 04:48:11 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:36.476 04:48:11 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:36.476 04:48:11 -- setup/devices.sh@47 -- # setup output config
00:04:36.476 04:48:11 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:36.476 04:48:11 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:39.762 04:48:14 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:39.762 04:48:14 -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]]
00:04:39.762 04:48:14 -- setup/devices.sh@63 -- # found=1
00:04:39.762 [xtrace condensed: the same sixteen non-matching 0000:00:04.x and 0000:80:04.x records were skipped]
00:04:39.762 04:48:14 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:39.762 04:48:14 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount ]]
00:04:39.762 04:48:14 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:39.762 04:48:14 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:04:39.762 04:48:14 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:04:39.762 04:48:14 -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:39.762 04:48:14 -- setup/devices.sh@125 -- # verify 0000:d8:00.0 data@nvme0n1 '' ''
00:04:39.762 04:48:14 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0
00:04:39.762 04:48:14 -- setup/devices.sh@49 -- # local mounts=data@nvme0n1
00:04:39.762 04:48:14 -- setup/devices.sh@50 -- # local mount_point=
00:04:39.762 04:48:14 -- setup/devices.sh@51 -- # local test_file=
00:04:39.762 04:48:14 -- setup/devices.sh@53 -- # local found=0
00:04:39.762 04:48:14 -- setup/devices.sh@55 -- # [[ -n '' ]]
00:04:39.762 04:48:14 -- setup/devices.sh@59 -- # local pci status
00:04:39.762 04:48:14 -- setup/devices.sh@60 -- # read -r pci _ _ status
00:04:39.762 04:48:14 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0
00:04:39.762 04:48:14 -- setup/devices.sh@47 -- # setup output config
00:04:39.762 04:48:14 -- setup/common.sh@9 -- # [[ output == output ]]
00:04:39.762 04:48:14 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config
00:04:43.049 04:48:17 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]]
00:04:43.049 04:48:17 -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]]
00:04:43.049 04:48:17 -- setup/devices.sh@63 -- # found=1
00:04:43.049 [xtrace condensed: sixteen non-matching 0000:00:04.x and 0000:80:04.x records skipped again]
00:04:43.049 04:48:18 -- setup/devices.sh@66 -- # (( found == 1 ))
00:04:43.049 04:48:18 -- setup/devices.sh@68 -- # [[ -n '' ]]
00:04:43.049 04:48:18 -- setup/devices.sh@68 -- # return 0
00:04:43.049 04:48:18 -- setup/devices.sh@128 -- # cleanup_nvme
00:04:43.049 04:48:18 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount
00:04:43.049 04:48:18 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:04:43.049 04:48:18 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:04:43.049 04:48:18 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:04:43.049 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:04:43.049 00:04:43.049 real 0m12.565s
00:04:43.049 user 0m3.619s
00:04:43.049 sys 0m6.870s
00:04:43.049 04:48:18 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:04:43.049 04:48:18 -- common/autotest_common.sh@10 -- # set +x
00:04:43.049 ************************************
00:04:43.049 END TEST nvme_mount
00:04:43.049 ************************************
00:04:43.049 04:48:18 -- setup/devices.sh@214 -- # run_test dm_mount dm_mount
00:04:43.049 04:48:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:04:43.049 04:48:18 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:04:43.049 04:48:18 -- common/autotest_common.sh@10 -- # set +x
00:04:43.049 ************************************
00:04:43.049 START TEST dm_mount
00:04:43.049 ************************************
00:04:43.049 04:48:18 -- common/autotest_common.sh@1114 -- # dm_mount
00:04:43.049 04:48:18 -- setup/devices.sh@144 -- # pv=nvme0n1
00:04:43.049 04:48:18 -- setup/devices.sh@145 -- # pv0=nvme0n1p1
00:04:43.049 04:48:18 -- setup/devices.sh@146 -- # pv1=nvme0n1p2
00:04:43.049 04:48:18 -- setup/devices.sh@148 -- # partition_drive nvme0n1
00:04:43.049 04:48:18 -- setup/common.sh@39 -- # local disk=nvme0n1
00:04:43.049 04:48:18 -- setup/common.sh@40 -- # local part_no=2
00:04:43.049 04:48:18 -- setup/common.sh@41 -- # local size=1073741824
00:04:43.049 04:48:18 -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:04:43.049 04:48:18 -- setup/common.sh@44 -- # parts=()
00:04:43.049 04:48:18 -- setup/common.sh@44 -- # local parts
00:04:43.049 04:48:18 -- setup/common.sh@46 -- # (( part = 1 ))
00:04:43.049 04:48:18 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:43.049 04:48:18 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:43.049 04:48:18 -- setup/common.sh@46 -- # (( part++ ))
00:04:43.049 04:48:18 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:43.049 04:48:18 -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:04:43.049 04:48:18 -- setup/common.sh@46 -- # (( part++ ))
00:04:43.049 04:48:18 -- setup/common.sh@46 -- # (( part <= part_no ))
00:04:43.049 04:48:18 -- setup/common.sh@51 -- # (( size /= 512 ))
00:04:43.049 04:48:18 -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:04:43.049 04:48:18 -- setup/common.sh@53 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:04:43.987 Creating new GPT entries in memory.
00:04:43.987 GPT data structures destroyed! You may now partition the disk using fdisk or
00:04:43.987 other utilities.
00:04:43.987 04:48:19 -- setup/common.sh@57 -- # (( part = 1 ))
00:04:43.987 04:48:19 -- setup/common.sh@57 -- # (( part <= part_no ))
00:04:43.987 04:48:19 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:04:43.987 04:48:19 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:04:43.987 04:48:19 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:04:45.367 Creating new GPT entries in memory.
00:04:45.367 The operation has completed successfully.
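[editor's note] Worth noticing in the partition_drive trace: each sgdisk call holds a flock on the whole disk, and sync_dev_uevents.sh makes the test wait until the kernel and udev have produced the new partition nodes (the second partition's sgdisk call follows just below). A hedged equivalent using stock tools, since the helper's internals are not shown in this log:

#!/usr/bin/env bash
# Sketch of the lock-partition-wait pattern for the two-partition dm layout.
disk=/dev/nvme0n1
flock "$disk" sgdisk "$disk" --new=1:2048:2099199     # p1, sectors as in the trace
flock "$disk" sgdisk "$disk" --new=2:2099200:4196351  # p2, sectors as in the trace
udevadm settle --timeout=30                            # stand-in for sync_dev_uevents.sh
for part in "${disk}p1" "${disk}p2"; do
  [[ -b $part ]] || { echo "partition node $part never appeared" >&2; exit 1; }
done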
00:04:45.367 04:48:20 -- setup/common.sh@57 -- # (( part++ )) 00:04:45.367 04:48:20 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:45.367 04:48:20 -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:45.367 04:48:20 -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:45.367 04:48:20 -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:04:46.304 The operation has completed successfully. 00:04:46.304 04:48:21 -- setup/common.sh@57 -- # (( part++ )) 00:04:46.304 04:48:21 -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:46.304 04:48:21 -- setup/common.sh@62 -- # wait 3642707 00:04:46.304 04:48:21 -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:46.304 04:48:21 -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:46.304 04:48:21 -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:46.304 04:48:21 -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:46.304 04:48:21 -- setup/devices.sh@160 -- # for t in {1..5} 00:04:46.304 04:48:21 -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:46.304 04:48:21 -- setup/devices.sh@161 -- # break 00:04:46.304 04:48:21 -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:46.304 04:48:21 -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:46.304 04:48:21 -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:46.304 04:48:21 -- setup/devices.sh@166 -- # dm=dm-0 00:04:46.304 04:48:21 -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:04:46.304 04:48:21 -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:04:46.304 04:48:21 -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:46.304 04:48:21 -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount size= 00:04:46.304 04:48:21 -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:46.304 04:48:21 -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:46.304 04:48:21 -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:46.304 04:48:21 -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:46.304 04:48:21 -- setup/devices.sh@174 -- # verify 0000:d8:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:46.304 04:48:21 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:46.304 04:48:21 -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:04:46.304 04:48:21 -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:46.304 04:48:21 -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:46.304 04:48:21 -- setup/devices.sh@53 -- # local found=0 00:04:46.304 04:48:21 -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:46.304 04:48:21 -- setup/devices.sh@56 -- # : 
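The two partitions are then assembled into a device-mapper node named nvme_dm_test, formatted with mkfs.ext4 -qF, mounted under test/setup/dm_mount, and a dummy test_dm file is created on it. The dm table itself never appears in the trace, so the linear concatenation below is an illustrative stand-in, not the script's actual layout:

    p1=/dev/nvme0n1p1; p2=/dev/nvme0n1p2
    s1=$(blockdev --getsz "$p1"); s2=$(blockdev --getsz "$p2")   # sizes in 512-byte sectors
    dmsetup create nvme_dm_test <<EOF
    0 $s1 linear $p1 0
    $s1 $s2 linear $p2 0
    EOF
    mkfs.ext4 -qF /dev/mapper/nvme_dm_test                       # matches the trace's mkfs call
    mkdir -p /tmp/dm_mount && mount /dev/mapper/nvme_dm_test /tmp/dm_mount   # mount point assumed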
00:04:46.304 04:48:21 -- setup/devices.sh@59 -- # local pci status 00:04:46.304 04:48:21 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:46.304 04:48:21 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:46.304 04:48:21 -- setup/devices.sh@47 -- # setup output config 00:04:46.304 04:48:21 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:46.304 04:48:21 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:49.594 04:48:24 -- setup/devices.sh@63 -- # found=1 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@62 -- # 
[[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:49.594 04:48:24 -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:49.594 04:48:24 -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:49.594 04:48:24 -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:49.594 04:48:24 -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:49.594 04:48:24 -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:49.594 04:48:24 -- setup/devices.sh@184 -- # verify 0000:d8:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:04:49.594 04:48:24 -- setup/devices.sh@48 -- # local dev=0000:d8:00.0 00:04:49.594 04:48:24 -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:04:49.594 04:48:24 -- setup/devices.sh@50 -- # local mount_point= 00:04:49.594 04:48:24 -- setup/devices.sh@51 -- # local test_file= 00:04:49.594 04:48:24 -- setup/devices.sh@53 -- # local found=0 00:04:49.594 04:48:24 -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:49.594 04:48:24 -- setup/devices.sh@59 -- # local pci status 00:04:49.594 04:48:24 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:49.594 04:48:24 -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:d8:00.0 00:04:49.594 04:48:24 -- setup/devices.sh@47 -- # setup output config 00:04:49.594 04:48:24 -- setup/common.sh@9 -- # [[ output == output ]] 00:04:49.594 04:48:24 -- setup/common.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh config 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:d8:00.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:04:52.881 04:48:27 -- setup/devices.sh@63 -- # found=1 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\d\8\:\0\0\.\0 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:52.881 04:48:27 -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:52.881 04:48:27 -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:52.881 04:48:27 -- setup/devices.sh@68 -- # return 0 00:04:52.881 04:48:27 -- setup/devices.sh@187 -- # cleanup_dm 00:04:52.881 04:48:27 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:52.881 04:48:27 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:52.881 04:48:27 -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:52.881 04:48:27 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:04:52.881 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:52.881 04:48:27 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:04:52.881 00:04:52.881 real 0m9.682s 00:04:52.881 user 0m2.444s 00:04:52.881 sys 0m4.318s 00:04:52.881 04:48:27 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:52.881 04:48:27 -- common/autotest_common.sh@10 -- # set +x 00:04:52.881 ************************************ 00:04:52.881 END TEST dm_mount 00:04:52.881 ************************************ 00:04:52.881 04:48:27 -- setup/devices.sh@1 -- # cleanup 00:04:52.881 04:48:27 -- setup/devices.sh@11 -- # cleanup_nvme 00:04:52.881 04:48:27 -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/nvme_mount 00:04:52.881 04:48:27 -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:04:52.881 04:48:27 -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:04:52.881 04:48:27 -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:04:53.140 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:53.140 /dev/nvme0n1: 8 
bytes were erased at offset 0x1749a955e00 (gpt): 45 46 49 20 50 41 52 54 00:04:53.140 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:53.140 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:04:53.140 04:48:28 -- setup/devices.sh@12 -- # cleanup_dm 00:04:53.140 04:48:28 -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/dm_mount 00:04:53.140 04:48:28 -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:53.140 04:48:28 -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:04:53.140 04:48:28 -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:04:53.140 04:48:28 -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:04:53.140 04:48:28 -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:04:53.140 00:04:53.140 real 0m26.714s 00:04:53.140 user 0m7.559s 00:04:53.140 sys 0m14.094s 00:04:53.140 04:48:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:53.140 04:48:28 -- common/autotest_common.sh@10 -- # set +x 00:04:53.140 ************************************ 00:04:53.140 END TEST devices 00:04:53.140 ************************************ 00:04:53.140 00:04:53.140 real 1m32.633s 00:04:53.140 user 0m28.545s 00:04:53.140 sys 0m52.881s 00:04:53.140 04:48:28 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:04:53.140 04:48:28 -- common/autotest_common.sh@10 -- # set +x 00:04:53.140 ************************************ 00:04:53.140 END TEST setup.sh 00:04:53.140 ************************************ 00:04:53.140 04:48:28 -- spdk/autotest.sh@126 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh status 00:04:56.430 Hugepages 00:04:56.430 node hugesize free / total 00:04:56.430 node0 1048576kB 0 / 0 00:04:56.430 node0 2048kB 2048 / 2048 00:04:56.430 node1 1048576kB 0 / 0 00:04:56.430 node1 2048kB 0 / 0 00:04:56.430 00:04:56.430 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:56.430 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:04:56.430 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:04:56.430 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:04:56.430 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:04:56.430 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:04:56.430 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:04:56.430 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:04:56.430 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:04:56.430 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:04:56.430 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:04:56.430 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:04:56.430 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:04:56.430 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:04:56.430 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:04:56.430 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:04:56.430 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:04:56.430 NVMe 0000:d8:00.0 8086 0a54 1 nvme nvme0 nvme0n1 00:04:56.430 04:48:31 -- spdk/autotest.sh@128 -- # uname -s 00:04:56.430 04:48:31 -- spdk/autotest.sh@128 -- # [[ Linux == Linux ]] 00:04:56.430 04:48:31 -- spdk/autotest.sh@130 -- # nvme_namespace_revert 00:04:56.430 04:48:31 -- common/autotest_common.sh@1526 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:04:59.719 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 
00:04:59.719 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:59.719 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:01.625 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:01.625 04:48:36 -- common/autotest_common.sh@1527 -- # sleep 1 00:05:02.563 04:48:37 -- common/autotest_common.sh@1528 -- # bdfs=() 00:05:02.563 04:48:37 -- common/autotest_common.sh@1528 -- # local bdfs 00:05:02.563 04:48:37 -- common/autotest_common.sh@1529 -- # bdfs=($(get_nvme_bdfs)) 00:05:02.563 04:48:37 -- common/autotest_common.sh@1529 -- # get_nvme_bdfs 00:05:02.563 04:48:37 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:02.563 04:48:37 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:02.563 04:48:37 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:02.563 04:48:37 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:02.563 04:48:37 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:02.563 04:48:37 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:02.563 04:48:37 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:02.563 04:48:37 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh reset 00:05:05.846 Waiting for block devices as requested 00:05:05.846 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:05.846 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:05.846 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:05.846 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:05.846 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:05.846 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:06.103 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:06.104 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:06.104 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:05:06.362 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:05:06.362 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:05:06.362 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:05:06.362 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:05:06.621 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:05:06.621 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:05:06.621 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:05:06.880 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme 00:05:06.880 04:48:41 -- common/autotest_common.sh@1533 -- # for bdf in "${bdfs[@]}" 00:05:06.880 04:48:41 -- common/autotest_common.sh@1534 -- # get_nvme_ctrlr_from_bdf 0000:d8:00.0 00:05:06.880 04:48:41 -- common/autotest_common.sh@1497 -- # readlink -f /sys/class/nvme/nvme0 00:05:06.880 04:48:41 -- common/autotest_common.sh@1497 -- # grep 0000:d8:00.0/nvme/nvme 00:05:06.880 04:48:41 -- common/autotest_common.sh@1497 -- # bdf_sysfs_path=/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:06.880 04:48:41 -- common/autotest_common.sh@1498 -- # [[ -z 
/sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 ]] 00:05:06.880 04:48:41 -- common/autotest_common.sh@1502 -- # basename /sys/devices/pci0000:d7/0000:d7:00.0/0000:d8:00.0/nvme/nvme0 00:05:06.880 04:48:41 -- common/autotest_common.sh@1502 -- # printf '%s\n' nvme0 00:05:07.138 04:48:41 -- common/autotest_common.sh@1534 -- # nvme_ctrlr=/dev/nvme0 00:05:07.138 04:48:41 -- common/autotest_common.sh@1535 -- # [[ -z /dev/nvme0 ]] 00:05:07.138 04:48:41 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:05:07.138 04:48:41 -- common/autotest_common.sh@1540 -- # grep oacs 00:05:07.138 04:48:41 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:05:07.138 04:48:42 -- common/autotest_common.sh@1540 -- # oacs=' 0xe' 00:05:07.138 04:48:42 -- common/autotest_common.sh@1541 -- # oacs_ns_manage=8 00:05:07.138 04:48:42 -- common/autotest_common.sh@1543 -- # [[ 8 -ne 0 ]] 00:05:07.138 04:48:42 -- common/autotest_common.sh@1549 -- # nvme id-ctrl /dev/nvme0 00:05:07.138 04:48:42 -- common/autotest_common.sh@1549 -- # grep unvmcap 00:05:07.138 04:48:42 -- common/autotest_common.sh@1549 -- # cut -d: -f2 00:05:07.138 04:48:42 -- common/autotest_common.sh@1549 -- # unvmcap=' 0' 00:05:07.138 04:48:42 -- common/autotest_common.sh@1550 -- # [[ 0 -eq 0 ]] 00:05:07.138 04:48:42 -- common/autotest_common.sh@1552 -- # continue 00:05:07.138 04:48:42 -- spdk/autotest.sh@133 -- # timing_exit pre_cleanup 00:05:07.138 04:48:42 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:07.138 04:48:42 -- common/autotest_common.sh@10 -- # set +x 00:05:07.138 04:48:42 -- spdk/autotest.sh@136 -- # timing_enter afterboot 00:05:07.138 04:48:42 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:07.138 04:48:42 -- common/autotest_common.sh@10 -- # set +x 00:05:07.138 04:48:42 -- spdk/autotest.sh@137 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/setup.sh 00:05:10.459 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:05:10.459 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:05:12.364 0000:d8:00.0 (8086 0a54): nvme -> vfio-pci 00:05:12.364 04:48:47 -- spdk/autotest.sh@138 -- # timing_exit afterboot 00:05:12.364 04:48:47 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:12.364 04:48:47 -- common/autotest_common.sh@10 -- # set +x 00:05:12.364 04:48:47 -- spdk/autotest.sh@142 -- # opal_revert_cleanup 00:05:12.364 04:48:47 -- common/autotest_common.sh@1586 -- # mapfile -t bdfs 00:05:12.364 04:48:47 -- common/autotest_common.sh@1586 -- # get_nvme_bdfs_by_id 0x0a54 00:05:12.364 04:48:47 -- common/autotest_common.sh@1572 -- # bdfs=() 00:05:12.364 04:48:47 -- common/autotest_common.sh@1572 -- # local bdfs 00:05:12.364 04:48:47 -- common/autotest_common.sh@1574 -- # 
get_nvme_bdfs 00:05:12.364 04:48:47 -- common/autotest_common.sh@1508 -- # bdfs=() 00:05:12.364 04:48:47 -- common/autotest_common.sh@1508 -- # local bdfs 00:05:12.364 04:48:47 -- common/autotest_common.sh@1509 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:05:12.364 04:48:47 -- common/autotest_common.sh@1509 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/gen_nvme.sh 00:05:12.364 04:48:47 -- common/autotest_common.sh@1509 -- # jq -r '.config[].params.traddr' 00:05:12.364 04:48:47 -- common/autotest_common.sh@1510 -- # (( 1 == 0 )) 00:05:12.364 04:48:47 -- common/autotest_common.sh@1514 -- # printf '%s\n' 0000:d8:00.0 00:05:12.364 04:48:47 -- common/autotest_common.sh@1574 -- # for bdf in $(get_nvme_bdfs) 00:05:12.364 04:48:47 -- common/autotest_common.sh@1575 -- # cat /sys/bus/pci/devices/0000:d8:00.0/device 00:05:12.364 04:48:47 -- common/autotest_common.sh@1575 -- # device=0x0a54 00:05:12.364 04:48:47 -- common/autotest_common.sh@1576 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:05:12.364 04:48:47 -- common/autotest_common.sh@1577 -- # bdfs+=($bdf) 00:05:12.364 04:48:47 -- common/autotest_common.sh@1581 -- # printf '%s\n' 0000:d8:00.0 00:05:12.364 04:48:47 -- common/autotest_common.sh@1587 -- # [[ -z 0000:d8:00.0 ]] 00:05:12.364 04:48:47 -- common/autotest_common.sh@1592 -- # spdk_tgt_pid=3652517 00:05:12.364 04:48:47 -- common/autotest_common.sh@1591 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:12.364 04:48:47 -- common/autotest_common.sh@1593 -- # waitforlisten 3652517 00:05:12.364 04:48:47 -- common/autotest_common.sh@829 -- # '[' -z 3652517 ']' 00:05:12.364 04:48:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:12.364 04:48:47 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:12.364 04:48:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:12.364 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:12.364 04:48:47 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:12.364 04:48:47 -- common/autotest_common.sh@10 -- # set +x 00:05:12.364 [2024-11-08 04:48:47.417849] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
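Controller discovery for opal_revert_cleanup goes through scripts/gen_nvme.sh, whose JSON output is filtered with jq for the transport addresses; each candidate is then kept only if its PCI device ID is 0x0a54 (an Intel datacenter NVMe part). The same filter, lifted straight from the traced commands:

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
    for bdf in "${bdfs[@]}"; do
        device=$(cat "/sys/bus/pci/devices/$bdf/device")   # e.g. 0x0a54
        [[ $device == 0x0a54 ]] && echo "opal revert candidate: $bdf"
    done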
00:05:12.364 [2024-11-08 04:48:47.417918] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3652517 ] 00:05:12.364 EAL: No free 2048 kB hugepages reported on node 1 00:05:12.623 [2024-11-08 04:48:47.488124] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:12.623 [2024-11-08 04:48:47.564613] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:12.623 [2024-11-08 04:48:47.564723] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:13.189 04:48:48 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:13.189 04:48:48 -- common/autotest_common.sh@862 -- # return 0 00:05:13.189 04:48:48 -- common/autotest_common.sh@1595 -- # bdf_id=0 00:05:13.189 04:48:48 -- common/autotest_common.sh@1596 -- # for bdf in "${bdfs[@]}" 00:05:13.189 04:48:48 -- common/autotest_common.sh@1597 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0 00:05:16.475 nvme0n1 00:05:16.475 04:48:51 -- common/autotest_common.sh@1599 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:05:16.475 [2024-11-08 04:48:51.412445] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:05:16.475 request: 00:05:16.475 { 00:05:16.475 "nvme_ctrlr_name": "nvme0", 00:05:16.475 "password": "test", 00:05:16.475 "method": "bdev_nvme_opal_revert", 00:05:16.475 "req_id": 1 00:05:16.475 } 00:05:16.475 Got JSON-RPC error response 00:05:16.475 response: 00:05:16.475 { 00:05:16.475 "code": -32602, 00:05:16.475 "message": "Invalid parameters" 00:05:16.475 } 00:05:16.475 04:48:51 -- common/autotest_common.sh@1599 -- # true 00:05:16.475 04:48:51 -- common/autotest_common.sh@1600 -- # (( ++bdf_id )) 00:05:16.475 04:48:51 -- common/autotest_common.sh@1603 -- # killprocess 3652517 00:05:16.475 04:48:51 -- common/autotest_common.sh@936 -- # '[' -z 3652517 ']' 00:05:16.475 04:48:51 -- common/autotest_common.sh@940 -- # kill -0 3652517 00:05:16.475 04:48:51 -- common/autotest_common.sh@941 -- # uname 00:05:16.475 04:48:51 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:16.475 04:48:51 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3652517 00:05:16.475 04:48:51 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:16.475 04:48:51 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:16.475 04:48:51 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3652517' 00:05:16.475 killing process with pid 3652517 00:05:16.475 04:48:51 -- common/autotest_common.sh@955 -- # kill 3652517 00:05:16.475 04:48:51 -- common/autotest_common.sh@960 -- # wait 3652517 00:05:19.007 04:48:53 -- spdk/autotest.sh@148 -- # '[' 0 -eq 1 ']' 00:05:19.007 04:48:53 -- spdk/autotest.sh@152 -- # '[' 1 -eq 1 ']' 00:05:19.007 04:48:53 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:19.007 04:48:53 -- spdk/autotest.sh@153 -- # [[ 0 -eq 1 ]] 00:05:19.007 04:48:53 -- spdk/autotest.sh@160 -- # timing_enter lib 00:05:19.007 04:48:53 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:19.007 04:48:53 -- common/autotest_common.sh@10 -- # set +x 00:05:19.007 04:48:53 -- spdk/autotest.sh@162 -- # run_test env /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:19.007 
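The revert itself is a JSON-RPC exchange against a freshly started spdk_tgt: attach the controller at 0000:d8:00.0 as bdev nvme0, call bdev_nvme_opal_revert, which fails here with -32602 because this drive reports no Opal support, then kill the target. A condensed reconstruction (the sleep is a crude stand-in for the script's waitforlisten on /var/tmp/spdk.sock):

    rootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    "$rootdir/build/bin/spdk_tgt" &
    spdk_tgt_pid=$!
    sleep 1                                              # waitforlisten stand-in
    "$rootdir/scripts/rpc.py" bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:d8:00.0
    "$rootdir/scripts/rpc.py" bdev_nvme_opal_revert -b nvme0 -p test || true   # expected to fail on non-Opal drives
    kill "$spdk_tgt_pid"; wait "$spdk_tgt_pid"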
04:48:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:19.007 04:48:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:19.007 04:48:53 -- common/autotest_common.sh@10 -- # set +x 00:05:19.007 ************************************ 00:05:19.007 START TEST env 00:05:19.007 ************************************ 00:05:19.007 04:48:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env.sh 00:05:19.007 * Looking for test storage... 00:05:19.007 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env 00:05:19.007 04:48:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:19.007 04:48:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:19.007 04:48:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:19.007 04:48:53 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:19.007 04:48:53 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:19.007 04:48:53 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:19.007 04:48:53 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:19.007 04:48:53 -- scripts/common.sh@335 -- # IFS=.-: 00:05:19.007 04:48:53 -- scripts/common.sh@335 -- # read -ra ver1 00:05:19.007 04:48:53 -- scripts/common.sh@336 -- # IFS=.-: 00:05:19.007 04:48:53 -- scripts/common.sh@336 -- # read -ra ver2 00:05:19.007 04:48:53 -- scripts/common.sh@337 -- # local 'op=<' 00:05:19.007 04:48:53 -- scripts/common.sh@339 -- # ver1_l=2 00:05:19.007 04:48:53 -- scripts/common.sh@340 -- # ver2_l=1 00:05:19.007 04:48:53 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:19.007 04:48:53 -- scripts/common.sh@343 -- # case "$op" in 00:05:19.007 04:48:53 -- scripts/common.sh@344 -- # : 1 00:05:19.007 04:48:53 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:19.007 04:48:53 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:19.007 04:48:53 -- scripts/common.sh@364 -- # decimal 1 00:05:19.007 04:48:53 -- scripts/common.sh@352 -- # local d=1 00:05:19.007 04:48:53 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:19.007 04:48:53 -- scripts/common.sh@354 -- # echo 1 00:05:19.007 04:48:53 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:19.007 04:48:53 -- scripts/common.sh@365 -- # decimal 2 00:05:19.007 04:48:53 -- scripts/common.sh@352 -- # local d=2 00:05:19.007 04:48:53 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:19.007 04:48:53 -- scripts/common.sh@354 -- # echo 2 00:05:19.007 04:48:53 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:19.007 04:48:53 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:19.007 04:48:53 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:19.007 04:48:53 -- scripts/common.sh@367 -- # return 0 00:05:19.007 04:48:53 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:19.007 04:48:53 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:19.007 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.007 --rc genhtml_branch_coverage=1 00:05:19.007 --rc genhtml_function_coverage=1 00:05:19.007 --rc genhtml_legend=1 00:05:19.007 --rc geninfo_all_blocks=1 00:05:19.007 --rc geninfo_unexecuted_blocks=1 00:05:19.007 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:19.007 ' 00:05:19.007 04:48:53 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:19.007 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.007 --rc genhtml_branch_coverage=1 00:05:19.007 --rc genhtml_function_coverage=1 00:05:19.007 --rc genhtml_legend=1 00:05:19.007 --rc geninfo_all_blocks=1 00:05:19.007 --rc geninfo_unexecuted_blocks=1 00:05:19.007 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:19.007 ' 00:05:19.007 04:48:53 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:19.007 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.007 --rc genhtml_branch_coverage=1 00:05:19.007 --rc genhtml_function_coverage=1 00:05:19.007 --rc genhtml_legend=1 00:05:19.007 --rc geninfo_all_blocks=1 00:05:19.007 --rc geninfo_unexecuted_blocks=1 00:05:19.007 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:19.007 ' 00:05:19.007 04:48:53 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:19.007 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.007 --rc genhtml_branch_coverage=1 00:05:19.007 --rc genhtml_function_coverage=1 00:05:19.007 --rc genhtml_legend=1 00:05:19.007 --rc geninfo_all_blocks=1 00:05:19.007 --rc geninfo_unexecuted_blocks=1 00:05:19.007 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:19.007 ' 00:05:19.007 04:48:53 -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 00:05:19.007 04:48:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:19.007 04:48:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:19.007 04:48:53 -- common/autotest_common.sh@10 -- # set +x 00:05:19.007 ************************************ 00:05:19.007 START TEST env_memory 00:05:19.007 ************************************ 00:05:19.007 04:48:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/memory/memory_ut 
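All of the scripts/common.sh tracing above answers one question: is the installed lcov older than 2, so that the pre-2.0 --rc option spellings should go into LCOV_OPTS? The script splits both version strings on '.', '-' and ':' and compares field by field. Where GNU sort is available, sort -V gives a compact equivalent; the function name below is mine, not the script's:

    version_lt() {   # true if $1 sorts strictly before $2 as a version string
        [ "$1" = "$2" ] && return 1
        [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
    }
    version_lt "$(lcov --version | awk '{print $NF}')" 2 && echo "use pre-2.0 lcov flags"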
00:05:19.007 00:05:19.007 00:05:19.007 CUnit - A unit testing framework for C - Version 2.1-3 00:05:19.007 http://cunit.sourceforge.net/ 00:05:19.007 00:05:19.007 00:05:19.007 Suite: memory 00:05:19.007 Test: alloc and free memory map ...[2024-11-08 04:48:53.944935] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:05:19.008 passed 00:05:19.008 Test: mem map translation ...[2024-11-08 04:48:53.958052] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:05:19.008 [2024-11-08 04:48:53.958070] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 591:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:05:19.008 [2024-11-08 04:48:53.958099] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:05:19.008 [2024-11-08 04:48:53.958108] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:05:19.008 passed 00:05:19.008 Test: mem map registration ...[2024-11-08 04:48:53.977612] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:05:19.008 [2024-11-08 04:48:53.977628] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:05:19.008 passed 00:05:19.008 Test: mem map adjacent registrations ...passed 00:05:19.008 00:05:19.008 Run Summary: Type Total Ran Passed Failed Inactive 00:05:19.008 suites 1 1 n/a 0 0 00:05:19.008 tests 4 4 4 0 0 00:05:19.008 asserts 152 152 152 0 n/a 00:05:19.008 00:05:19.008 Elapsed time = 0.083 seconds 00:05:19.008 00:05:19.008 real 0m0.096s 00:05:19.008 user 0m0.081s 00:05:19.008 sys 0m0.015s 00:05:19.008 04:48:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:19.008 04:48:54 -- common/autotest_common.sh@10 -- # set +x 00:05:19.008 ************************************ 00:05:19.008 END TEST env_memory 00:05:19.008 ************************************ 00:05:19.008 04:48:54 -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:19.008 04:48:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:19.008 04:48:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:19.008 04:48:54 -- common/autotest_common.sh@10 -- # set +x 00:05:19.008 ************************************ 00:05:19.008 START TEST env_vtophys 00:05:19.008 ************************************ 00:05:19.008 04:48:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:19.008 EAL: lib.eal log level changed from notice to debug 00:05:19.008 EAL: Detected lcore 0 as core 0 on socket 0 00:05:19.008 EAL: Detected lcore 1 as core 1 on socket 0 00:05:19.008 EAL: Detected lcore 2 as core 2 on socket 0 00:05:19.008 EAL: Detected lcore 3 as core 3 on socket 0 00:05:19.008 EAL: Detected lcore 4 as core 4 on socket 0 00:05:19.008 EAL: Detected lcore 5 as core 5 on socket 0 00:05:19.008 EAL: Detected lcore 6 as 
core 6 on socket 0 00:05:19.008 EAL: Detected lcore 7 as core 8 on socket 0 00:05:19.008 EAL: Detected lcore 8 as core 9 on socket 0 00:05:19.008 EAL: Detected lcore 9 as core 10 on socket 0 00:05:19.008 EAL: Detected lcore 10 as core 11 on socket 0 00:05:19.008 EAL: Detected lcore 11 as core 12 on socket 0 00:05:19.008 EAL: Detected lcore 12 as core 13 on socket 0 00:05:19.008 EAL: Detected lcore 13 as core 14 on socket 0 00:05:19.008 EAL: Detected lcore 14 as core 16 on socket 0 00:05:19.008 EAL: Detected lcore 15 as core 17 on socket 0 00:05:19.008 EAL: Detected lcore 16 as core 18 on socket 0 00:05:19.008 EAL: Detected lcore 17 as core 19 on socket 0 00:05:19.008 EAL: Detected lcore 18 as core 20 on socket 0 00:05:19.008 EAL: Detected lcore 19 as core 21 on socket 0 00:05:19.008 EAL: Detected lcore 20 as core 22 on socket 0 00:05:19.008 EAL: Detected lcore 21 as core 24 on socket 0 00:05:19.008 EAL: Detected lcore 22 as core 25 on socket 0 00:05:19.008 EAL: Detected lcore 23 as core 26 on socket 0 00:05:19.008 EAL: Detected lcore 24 as core 27 on socket 0 00:05:19.008 EAL: Detected lcore 25 as core 28 on socket 0 00:05:19.008 EAL: Detected lcore 26 as core 29 on socket 0 00:05:19.008 EAL: Detected lcore 27 as core 30 on socket 0 00:05:19.008 EAL: Detected lcore 28 as core 0 on socket 1 00:05:19.008 EAL: Detected lcore 29 as core 1 on socket 1 00:05:19.008 EAL: Detected lcore 30 as core 2 on socket 1 00:05:19.008 EAL: Detected lcore 31 as core 3 on socket 1 00:05:19.008 EAL: Detected lcore 32 as core 4 on socket 1 00:05:19.008 EAL: Detected lcore 33 as core 5 on socket 1 00:05:19.008 EAL: Detected lcore 34 as core 6 on socket 1 00:05:19.008 EAL: Detected lcore 35 as core 8 on socket 1 00:05:19.008 EAL: Detected lcore 36 as core 9 on socket 1 00:05:19.008 EAL: Detected lcore 37 as core 10 on socket 1 00:05:19.008 EAL: Detected lcore 38 as core 11 on socket 1 00:05:19.008 EAL: Detected lcore 39 as core 12 on socket 1 00:05:19.008 EAL: Detected lcore 40 as core 13 on socket 1 00:05:19.008 EAL: Detected lcore 41 as core 14 on socket 1 00:05:19.008 EAL: Detected lcore 42 as core 16 on socket 1 00:05:19.008 EAL: Detected lcore 43 as core 17 on socket 1 00:05:19.008 EAL: Detected lcore 44 as core 18 on socket 1 00:05:19.008 EAL: Detected lcore 45 as core 19 on socket 1 00:05:19.008 EAL: Detected lcore 46 as core 20 on socket 1 00:05:19.008 EAL: Detected lcore 47 as core 21 on socket 1 00:05:19.008 EAL: Detected lcore 48 as core 22 on socket 1 00:05:19.008 EAL: Detected lcore 49 as core 24 on socket 1 00:05:19.008 EAL: Detected lcore 50 as core 25 on socket 1 00:05:19.008 EAL: Detected lcore 51 as core 26 on socket 1 00:05:19.008 EAL: Detected lcore 52 as core 27 on socket 1 00:05:19.008 EAL: Detected lcore 53 as core 28 on socket 1 00:05:19.008 EAL: Detected lcore 54 as core 29 on socket 1 00:05:19.008 EAL: Detected lcore 55 as core 30 on socket 1 00:05:19.008 EAL: Detected lcore 56 as core 0 on socket 0 00:05:19.008 EAL: Detected lcore 57 as core 1 on socket 0 00:05:19.008 EAL: Detected lcore 58 as core 2 on socket 0 00:05:19.008 EAL: Detected lcore 59 as core 3 on socket 0 00:05:19.008 EAL: Detected lcore 60 as core 4 on socket 0 00:05:19.008 EAL: Detected lcore 61 as core 5 on socket 0 00:05:19.008 EAL: Detected lcore 62 as core 6 on socket 0 00:05:19.008 EAL: Detected lcore 63 as core 8 on socket 0 00:05:19.008 EAL: Detected lcore 64 as core 9 on socket 0 00:05:19.008 EAL: Detected lcore 65 as core 10 on socket 0 00:05:19.008 EAL: Detected lcore 66 as core 11 on socket 0 00:05:19.008 EAL: 
Detected lcore 67 as core 12 on socket 0 00:05:19.008 EAL: Detected lcore 68 as core 13 on socket 0 00:05:19.008 EAL: Detected lcore 69 as core 14 on socket 0 00:05:19.008 EAL: Detected lcore 70 as core 16 on socket 0 00:05:19.008 EAL: Detected lcore 71 as core 17 on socket 0 00:05:19.008 EAL: Detected lcore 72 as core 18 on socket 0 00:05:19.008 EAL: Detected lcore 73 as core 19 on socket 0 00:05:19.008 EAL: Detected lcore 74 as core 20 on socket 0 00:05:19.008 EAL: Detected lcore 75 as core 21 on socket 0 00:05:19.008 EAL: Detected lcore 76 as core 22 on socket 0 00:05:19.008 EAL: Detected lcore 77 as core 24 on socket 0 00:05:19.008 EAL: Detected lcore 78 as core 25 on socket 0 00:05:19.008 EAL: Detected lcore 79 as core 26 on socket 0 00:05:19.008 EAL: Detected lcore 80 as core 27 on socket 0 00:05:19.008 EAL: Detected lcore 81 as core 28 on socket 0 00:05:19.008 EAL: Detected lcore 82 as core 29 on socket 0 00:05:19.008 EAL: Detected lcore 83 as core 30 on socket 0 00:05:19.008 EAL: Detected lcore 84 as core 0 on socket 1 00:05:19.008 EAL: Detected lcore 85 as core 1 on socket 1 00:05:19.008 EAL: Detected lcore 86 as core 2 on socket 1 00:05:19.008 EAL: Detected lcore 87 as core 3 on socket 1 00:05:19.008 EAL: Detected lcore 88 as core 4 on socket 1 00:05:19.008 EAL: Detected lcore 89 as core 5 on socket 1 00:05:19.008 EAL: Detected lcore 90 as core 6 on socket 1 00:05:19.008 EAL: Detected lcore 91 as core 8 on socket 1 00:05:19.008 EAL: Detected lcore 92 as core 9 on socket 1 00:05:19.008 EAL: Detected lcore 93 as core 10 on socket 1 00:05:19.008 EAL: Detected lcore 94 as core 11 on socket 1 00:05:19.008 EAL: Detected lcore 95 as core 12 on socket 1 00:05:19.008 EAL: Detected lcore 96 as core 13 on socket 1 00:05:19.008 EAL: Detected lcore 97 as core 14 on socket 1 00:05:19.008 EAL: Detected lcore 98 as core 16 on socket 1 00:05:19.008 EAL: Detected lcore 99 as core 17 on socket 1 00:05:19.008 EAL: Detected lcore 100 as core 18 on socket 1 00:05:19.008 EAL: Detected lcore 101 as core 19 on socket 1 00:05:19.008 EAL: Detected lcore 102 as core 20 on socket 1 00:05:19.008 EAL: Detected lcore 103 as core 21 on socket 1 00:05:19.008 EAL: Detected lcore 104 as core 22 on socket 1 00:05:19.008 EAL: Detected lcore 105 as core 24 on socket 1 00:05:19.008 EAL: Detected lcore 106 as core 25 on socket 1 00:05:19.008 EAL: Detected lcore 107 as core 26 on socket 1 00:05:19.008 EAL: Detected lcore 108 as core 27 on socket 1 00:05:19.008 EAL: Detected lcore 109 as core 28 on socket 1 00:05:19.008 EAL: Detected lcore 110 as core 29 on socket 1 00:05:19.008 EAL: Detected lcore 111 as core 30 on socket 1 00:05:19.008 EAL: Maximum logical cores by configuration: 128 00:05:19.008 EAL: Detected CPU lcores: 112 00:05:19.008 EAL: Detected NUMA nodes: 2 00:05:19.008 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:05:19.008 EAL: Checking presence of .so 'librte_eal.so.24' 00:05:19.008 EAL: Checking presence of .so 'librte_eal.so' 00:05:19.008 EAL: Detected static linkage of DPDK 00:05:19.008 EAL: No shared files mode enabled, IPC will be disabled 00:05:19.008 EAL: Bus pci wants IOVA as 'DC' 00:05:19.008 EAL: Buses did not request a specific IOVA mode. 00:05:19.008 EAL: IOMMU is available, selecting IOVA as VA mode. 00:05:19.008 EAL: Selected IOVA mode 'VA' 00:05:19.008 EAL: No free 2048 kB hugepages reported on node 1 00:05:19.008 EAL: Probing VFIO support... 
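The lcore banner EAL prints at startup is the kernel's CPU topology read back: 112 logical cores across 2 sockets, with lcores 56-111 being the hyper-thread siblings of 0-55 (the same core/socket pairs repeat). Roughly the same map can be pulled from sysfs directly:

    for cpu in /sys/devices/system/cpu/cpu[0-9]*; do
        lcore=${cpu##*cpu}
        core=$(cat "$cpu/topology/core_id")
        socket=$(cat "$cpu/topology/physical_package_id")
        echo "Detected lcore $lcore as core $core on socket $socket"
    done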
00:05:19.008 EAL: IOMMU type 1 (Type 1) is supported 00:05:19.008 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:19.008 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:19.008 EAL: VFIO support initialized 00:05:19.008 EAL: Ask a virtual area of 0x2e000 bytes 00:05:19.008 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:19.008 EAL: Setting up physically contiguous memory... 00:05:19.008 EAL: Setting maximum number of open files to 524288 00:05:19.008 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:19.008 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:19.008 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:19.008 EAL: Ask a virtual area of 0x61000 bytes 00:05:19.008 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:19.008 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:19.008 EAL: Ask a virtual area of 0x400000000 bytes 00:05:19.008 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:19.008 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:19.009 EAL: Ask a virtual area of 0x61000 bytes 00:05:19.009 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:19.009 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:19.009 EAL: Ask a virtual area of 0x400000000 bytes 00:05:19.009 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:19.009 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:19.009 EAL: Ask a virtual area of 0x61000 bytes 00:05:19.009 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:19.009 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:19.009 EAL: Ask a virtual area of 0x400000000 bytes 00:05:19.009 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:19.009 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:19.009 EAL: Ask a virtual area of 0x61000 bytes 00:05:19.009 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:19.009 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:19.009 EAL: Ask a virtual area of 0x400000000 bytes 00:05:19.009 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:19.009 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:19.009 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:05:19.009 EAL: Ask a virtual area of 0x61000 bytes 00:05:19.009 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:19.009 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:19.009 EAL: Ask a virtual area of 0x400000000 bytes 00:05:19.009 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:19.009 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:19.009 EAL: Ask a virtual area of 0x61000 bytes 00:05:19.009 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:19.009 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:19.009 EAL: Ask a virtual area of 0x400000000 bytes 00:05:19.009 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:19.009 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:19.009 EAL: Ask a virtual area of 0x61000 bytes 00:05:19.009 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:19.009 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:19.009 EAL: Ask a virtual area of 0x400000000 bytes 00:05:19.009 EAL: Virtual area found at 
0x201800e00000 (size = 0x400000000) 00:05:19.009 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:19.009 EAL: Ask a virtual area of 0x61000 bytes 00:05:19.009 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:19.009 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:19.009 EAL: Ask a virtual area of 0x400000000 bytes 00:05:19.009 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:19.009 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:19.009 EAL: Hugepages will be freed exactly as allocated. 00:05:19.009 EAL: No shared files mode enabled, IPC is disabled 00:05:19.009 EAL: No shared files mode enabled, IPC is disabled 00:05:19.009 EAL: TSC frequency is ~2500000 KHz 00:05:19.009 EAL: Main lcore 0 is ready (tid=7f320e4a0a00;cpuset=[0]) 00:05:19.009 EAL: Trying to obtain current memory policy. 00:05:19.009 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.009 EAL: Restoring previous memory policy: 0 00:05:19.009 EAL: request: mp_malloc_sync 00:05:19.009 EAL: No shared files mode enabled, IPC is disabled 00:05:19.009 EAL: Heap on socket 0 was expanded by 2MB 00:05:19.009 EAL: No shared files mode enabled, IPC is disabled 00:05:19.267 EAL: Mem event callback 'spdk:(nil)' registered 00:05:19.267 00:05:19.267 00:05:19.267 CUnit - A unit testing framework for C - Version 2.1-3 00:05:19.267 http://cunit.sourceforge.net/ 00:05:19.267 00:05:19.267 00:05:19.267 Suite: components_suite 00:05:19.267 Test: vtophys_malloc_test ...passed 00:05:19.267 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 00:05:19.267 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.268 EAL: Restoring previous memory policy: 4 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was expanded by 4MB 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was shrunk by 4MB 00:05:19.268 EAL: Trying to obtain current memory policy. 00:05:19.268 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.268 EAL: Restoring previous memory policy: 4 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was expanded by 6MB 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was shrunk by 6MB 00:05:19.268 EAL: Trying to obtain current memory policy. 00:05:19.268 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.268 EAL: Restoring previous memory policy: 4 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was expanded by 10MB 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was shrunk by 10MB 00:05:19.268 EAL: Trying to obtain current memory policy. 
00:05:19.268 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.268 EAL: Restoring previous memory policy: 4 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was expanded by 18MB 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was shrunk by 18MB 00:05:19.268 EAL: Trying to obtain current memory policy. 00:05:19.268 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.268 EAL: Restoring previous memory policy: 4 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was expanded by 34MB 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was shrunk by 34MB 00:05:19.268 EAL: Trying to obtain current memory policy. 00:05:19.268 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.268 EAL: Restoring previous memory policy: 4 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was expanded by 66MB 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was shrunk by 66MB 00:05:19.268 EAL: Trying to obtain current memory policy. 00:05:19.268 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.268 EAL: Restoring previous memory policy: 4 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was expanded by 130MB 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was shrunk by 130MB 00:05:19.268 EAL: Trying to obtain current memory policy. 00:05:19.268 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.268 EAL: Restoring previous memory policy: 4 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was expanded by 258MB 00:05:19.268 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.268 EAL: request: mp_malloc_sync 00:05:19.268 EAL: No shared files mode enabled, IPC is disabled 00:05:19.268 EAL: Heap on socket 0 was shrunk by 258MB 00:05:19.268 EAL: Trying to obtain current memory policy. 
00:05:19.268 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.526 EAL: Restoring previous memory policy: 4 00:05:19.526 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.526 EAL: request: mp_malloc_sync 00:05:19.526 EAL: No shared files mode enabled, IPC is disabled 00:05:19.526 EAL: Heap on socket 0 was expanded by 514MB 00:05:19.526 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.526 EAL: request: mp_malloc_sync 00:05:19.526 EAL: No shared files mode enabled, IPC is disabled 00:05:19.526 EAL: Heap on socket 0 was shrunk by 514MB 00:05:19.526 EAL: Trying to obtain current memory policy. 00:05:19.526 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:19.784 EAL: Restoring previous memory policy: 4 00:05:19.784 EAL: Calling mem event callback 'spdk:(nil)' 00:05:19.784 EAL: request: mp_malloc_sync 00:05:19.784 EAL: No shared files mode enabled, IPC is disabled 00:05:19.784 EAL: Heap on socket 0 was expanded by 1026MB 00:05:20.043 EAL: Calling mem event callback 'spdk:(nil)' 00:05:20.043 EAL: request: mp_malloc_sync 00:05:20.043 EAL: No shared files mode enabled, IPC is disabled 00:05:20.043 EAL: Heap on socket 0 was shrunk by 1026MB 00:05:20.043 passed 00:05:20.043 00:05:20.043 Run Summary: Type Total Ran Passed Failed Inactive 00:05:20.043 suites 1 1 n/a 0 0 00:05:20.043 tests 2 2 2 0 0 00:05:20.043 asserts 497 497 497 0 n/a 00:05:20.043 00:05:20.043 Elapsed time = 0.960 seconds 00:05:20.043 EAL: Calling mem event callback 'spdk:(nil)' 00:05:20.043 EAL: request: mp_malloc_sync 00:05:20.043 EAL: No shared files mode enabled, IPC is disabled 00:05:20.043 EAL: Heap on socket 0 was shrunk by 2MB 00:05:20.043 EAL: No shared files mode enabled, IPC is disabled 00:05:20.043 EAL: No shared files mode enabled, IPC is disabled 00:05:20.043 EAL: No shared files mode enabled, IPC is disabled 00:05:20.043 00:05:20.043 real 0m1.082s 00:05:20.043 user 0m0.626s 00:05:20.043 sys 0m0.429s 00:05:20.043 04:48:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:20.043 04:48:55 -- common/autotest_common.sh@10 -- # set +x 00:05:20.043 ************************************ 00:05:20.043 END TEST env_vtophys 00:05:20.043 ************************************ 00:05:20.301 04:48:55 -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:20.302 04:48:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:20.302 04:48:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:20.302 04:48:55 -- common/autotest_common.sh@10 -- # set +x 00:05:20.302 ************************************ 00:05:20.302 START TEST env_pci 00:05:20.302 ************************************ 00:05:20.302 04:48:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/pci/pci_ut 00:05:20.302 00:05:20.302 00:05:20.302 CUnit - A unit testing framework for C - Version 2.1-3 00:05:20.302 http://cunit.sourceforge.net/ 00:05:20.302 00:05:20.302 00:05:20.302 Suite: pci 00:05:20.302 Test: pci_hook ...[2024-11-08 04:48:55.199345] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk/pci.c:1041:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 3654028 has claimed it 00:05:20.302 EAL: Cannot find device (10000:00:01.0) 00:05:20.302 EAL: Failed to attach device on primary process 00:05:20.302 passed 00:05:20.302 00:05:20.302 Run Summary: Type Total Ran Passed Failed Inactive 00:05:20.302 suites 1 1 n/a 0 0 00:05:20.302 tests 1 1 1 0 0 
00:05:20.302 asserts 25 25 25 0 n/a 00:05:20.302 00:05:20.302 Elapsed time = 0.037 seconds 00:05:20.302 00:05:20.302 real 0m0.057s 00:05:20.302 user 0m0.014s 00:05:20.302 sys 0m0.043s 00:05:20.302 04:48:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:20.302 04:48:55 -- common/autotest_common.sh@10 -- # set +x 00:05:20.302 ************************************ 00:05:20.302 END TEST env_pci 00:05:20.302 ************************************ 00:05:20.302 04:48:55 -- env/env.sh@14 -- # argv='-c 0x1 ' 00:05:20.302 04:48:55 -- env/env.sh@15 -- # uname 00:05:20.302 04:48:55 -- env/env.sh@15 -- # '[' Linux = Linux ']' 00:05:20.302 04:48:55 -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:05:20.302 04:48:55 -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:20.302 04:48:55 -- common/autotest_common.sh@1087 -- # '[' 5 -le 1 ']' 00:05:20.302 04:48:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:20.302 04:48:55 -- common/autotest_common.sh@10 -- # set +x 00:05:20.302 ************************************ 00:05:20.302 START TEST env_dpdk_post_init 00:05:20.302 ************************************ 00:05:20.302 04:48:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:20.302 EAL: Detected CPU lcores: 112 00:05:20.302 EAL: Detected NUMA nodes: 2 00:05:20.302 EAL: Detected static linkage of DPDK 00:05:20.302 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:20.302 EAL: Selected IOVA mode 'VA' 00:05:20.302 EAL: No free 2048 kB hugepages reported on node 1 00:05:20.302 EAL: VFIO support initialized 00:05:20.302 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:20.561 EAL: Using IOMMU type 1 (Type 1) 00:05:21.128 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:d8:00.0 (socket 1) 00:05:25.311 EAL: Releasing PCI mapped resource for 0000:d8:00.0 00:05:25.311 EAL: Calling pci_unmap_resource for 0000:d8:00.0 at 0x202001000000 00:05:25.311 Starting DPDK initialization... 00:05:25.311 Starting SPDK post initialization... 00:05:25.311 SPDK NVMe probe 00:05:25.311 Attaching to 0000:d8:00.0 00:05:25.311 Attached to 0000:d8:00.0 00:05:25.311 Cleaning up... 
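The probe/attach lines just above are the heart of env_dpdk_post_init: once DPDK is initialized, the test enumerates local NVMe controllers through the public SPDK driver API. A rough sketch of that flow, assuming SPDK headers and libraries are installed (init signatures vary slightly across SPDK releases); the app name is made up and error handling is trimmed:

/* Sketch of the probe/attach sequence behind "Attaching to 0000:d8:00.0"
 * above, using the public SPDK NVMe driver API. */
#include <stdbool.h>
#include <stdio.h>
#include "spdk/env.h"
#include "spdk/nvme.h"

static bool
probe_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
         struct spdk_nvme_ctrlr_opts *opts)
{
    printf("Attaching to %s\n", trid->traddr);
    return true;   /* claim every controller we are offered */
}

static void
attach_cb(void *ctx, const struct spdk_nvme_transport_id *trid,
          struct spdk_nvme_ctrlr *ctrlr, const struct spdk_nvme_ctrlr_opts *opts)
{
    printf("Attached to %s\n", trid->traddr);
    spdk_nvme_detach(ctrlr);   /* the "Cleaning up..." step */
}

int
main(void)
{
    struct spdk_env_opts opts;

    spdk_env_opts_init(&opts);
    opts.name = "post_init_demo";
    if (spdk_env_init(&opts) < 0)
        return 1;

    /* Walks the local PCI bus and invokes probe_cb/attach_cb per NVMe
     * controller, as the test binary does after DPDK initialization. */
    if (spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL) != 0)
        return 1;

    return 0;
}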
00:05:25.311 00:05:25.311 real 0m4.702s 00:05:25.311 user 0m3.543s 00:05:25.311 sys 0m0.402s 00:05:25.311 04:48:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:25.312 04:48:59 -- common/autotest_common.sh@10 -- # set +x 00:05:25.312 ************************************ 00:05:25.312 END TEST env_dpdk_post_init 00:05:25.312 ************************************ 00:05:25.312 04:49:00 -- env/env.sh@26 -- # uname 00:05:25.312 04:49:00 -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:25.312 04:49:00 -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:25.312 04:49:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:25.312 04:49:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:25.312 04:49:00 -- common/autotest_common.sh@10 -- # set +x 00:05:25.312 ************************************ 00:05:25.312 START TEST env_mem_callbacks 00:05:25.312 ************************************ 00:05:25.312 04:49:00 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:25.312 EAL: Detected CPU lcores: 112 00:05:25.312 EAL: Detected NUMA nodes: 2 00:05:25.312 EAL: Detected static linkage of DPDK 00:05:25.312 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:25.312 EAL: Selected IOVA mode 'VA' 00:05:25.312 EAL: No free 2048 kB hugepages reported on node 1 00:05:25.312 EAL: VFIO support initialized 00:05:25.312 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:25.312 00:05:25.312 00:05:25.312 CUnit - A unit testing framework for C - Version 2.1-3 00:05:25.312 http://cunit.sourceforge.net/ 00:05:25.312 00:05:25.312 00:05:25.312 Suite: memory 00:05:25.312 Test: test ... 
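The register/unregister lines that follow come from the mem_callbacks test's notify hook: it allocates an spdk_mem_map whose callback fires whenever an address range is registered with or unregistered from the env layer. A sketch under the assumption that the spdk/env.h names below match your release; the sizes are illustrative, not the test's exact values:

/* Approximation of where the "register ..."/"unregister ..." lines
 * printed below come from: an spdk_mem_map notify callback. */
#include <stdio.h>
#include "spdk/env.h"

static int
notify_cb(void *cb_ctx, struct spdk_mem_map *map,
          enum spdk_mem_map_notify_action action, void *vaddr, size_t size)
{
    printf("%s %p %zu\n",
           action == SPDK_MEM_MAP_NOTIFY_REGISTER ? "register" : "unregister",
           vaddr, size);
    return 0;
}

static const struct spdk_mem_map_ops ops = {
    .notify_cb = notify_cb,
};

int
main(void)
{
    struct spdk_env_opts opts;
    struct spdk_mem_map *map;

    spdk_env_opts_init(&opts);
    if (spdk_env_init(&opts) < 0)
        return 1;

    /* Existing registrations are replayed into the new map, and every
     * later DMA allocation/free triggers another notification - which
     * is exactly the trace below. */
    map = spdk_mem_map_alloc(0, &ops, NULL);

    void *buf = spdk_malloc(3 * 1024 * 1024, 0x1000, NULL,
                            SPDK_ENV_SOCKET_ID_ANY, SPDK_MALLOC_DMA);
    spdk_free(buf);

    spdk_mem_map_free(&map);
    return 0;
}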
00:05:25.312 register 0x200000200000 2097152 00:05:25.312 malloc 3145728 00:05:25.312 register 0x200000400000 4194304 00:05:25.312 buf 0x200000500000 len 3145728 PASSED 00:05:25.312 malloc 64 00:05:25.312 buf 0x2000004fff40 len 64 PASSED 00:05:25.312 malloc 4194304 00:05:25.312 register 0x200000800000 6291456 00:05:25.312 buf 0x200000a00000 len 4194304 PASSED 00:05:25.312 free 0x200000500000 3145728 00:05:25.312 free 0x2000004fff40 64 00:05:25.312 unregister 0x200000400000 4194304 PASSED 00:05:25.312 free 0x200000a00000 4194304 00:05:25.312 unregister 0x200000800000 6291456 PASSED 00:05:25.312 malloc 8388608 00:05:25.312 register 0x200000400000 10485760 00:05:25.312 buf 0x200000600000 len 8388608 PASSED 00:05:25.312 free 0x200000600000 8388608 00:05:25.312 unregister 0x200000400000 10485760 PASSED 00:05:25.312 passed 00:05:25.312 00:05:25.312 Run Summary: Type Total Ran Passed Failed Inactive 00:05:25.312 suites 1 1 n/a 0 0 00:05:25.312 tests 1 1 1 0 0 00:05:25.312 asserts 15 15 15 0 n/a 00:05:25.312 00:05:25.312 Elapsed time = 0.005 seconds 00:05:25.312 00:05:25.312 real 0m0.064s 00:05:25.312 user 0m0.015s 00:05:25.312 sys 0m0.048s 00:05:25.312 04:49:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:25.312 04:49:00 -- common/autotest_common.sh@10 -- # set +x 00:05:25.312 ************************************ 00:05:25.312 END TEST env_mem_callbacks 00:05:25.312 ************************************ 00:05:25.312 00:05:25.312 real 0m6.445s 00:05:25.312 user 0m4.459s 00:05:25.312 sys 0m1.259s 00:05:25.312 04:49:00 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:25.312 04:49:00 -- common/autotest_common.sh@10 -- # set +x 00:05:25.312 ************************************ 00:05:25.312 END TEST env 00:05:25.312 ************************************ 00:05:25.312 04:49:00 -- spdk/autotest.sh@163 -- # run_test rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:25.312 04:49:00 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:25.312 04:49:00 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:25.312 04:49:00 -- common/autotest_common.sh@10 -- # set +x 00:05:25.312 ************************************ 00:05:25.312 START TEST rpc 00:05:25.312 ************************************ 00:05:25.312 04:49:00 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc/rpc.sh 00:05:25.312 * Looking for test storage... 
00:05:25.312 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:25.312 04:49:00 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:25.312 04:49:00 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:25.312 04:49:00 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:25.312 04:49:00 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:25.312 04:49:00 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:25.312 04:49:00 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:25.312 04:49:00 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:25.312 04:49:00 -- scripts/common.sh@335 -- # IFS=.-: 00:05:25.312 04:49:00 -- scripts/common.sh@335 -- # read -ra ver1 00:05:25.312 04:49:00 -- scripts/common.sh@336 -- # IFS=.-: 00:05:25.312 04:49:00 -- scripts/common.sh@336 -- # read -ra ver2 00:05:25.312 04:49:00 -- scripts/common.sh@337 -- # local 'op=<' 00:05:25.312 04:49:00 -- scripts/common.sh@339 -- # ver1_l=2 00:05:25.312 04:49:00 -- scripts/common.sh@340 -- # ver2_l=1 00:05:25.312 04:49:00 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:25.312 04:49:00 -- scripts/common.sh@343 -- # case "$op" in 00:05:25.312 04:49:00 -- scripts/common.sh@344 -- # : 1 00:05:25.312 04:49:00 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:25.312 04:49:00 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:25.312 04:49:00 -- scripts/common.sh@364 -- # decimal 1 00:05:25.312 04:49:00 -- scripts/common.sh@352 -- # local d=1 00:05:25.312 04:49:00 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:25.312 04:49:00 -- scripts/common.sh@354 -- # echo 1 00:05:25.312 04:49:00 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:25.312 04:49:00 -- scripts/common.sh@365 -- # decimal 2 00:05:25.312 04:49:00 -- scripts/common.sh@352 -- # local d=2 00:05:25.312 04:49:00 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:25.312 04:49:00 -- scripts/common.sh@354 -- # echo 2 00:05:25.312 04:49:00 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:25.312 04:49:00 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:25.312 04:49:00 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:25.312 04:49:00 -- scripts/common.sh@367 -- # return 0 00:05:25.312 04:49:00 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:25.312 04:49:00 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:25.312 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:25.312 --rc genhtml_branch_coverage=1 00:05:25.312 --rc genhtml_function_coverage=1 00:05:25.312 --rc genhtml_legend=1 00:05:25.312 --rc geninfo_all_blocks=1 00:05:25.312 --rc geninfo_unexecuted_blocks=1 00:05:25.312 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:25.312 ' 00:05:25.312 04:49:00 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:25.312 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:25.312 --rc genhtml_branch_coverage=1 00:05:25.312 --rc genhtml_function_coverage=1 00:05:25.312 --rc genhtml_legend=1 00:05:25.312 --rc geninfo_all_blocks=1 00:05:25.312 --rc geninfo_unexecuted_blocks=1 00:05:25.312 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:25.312 ' 00:05:25.312 04:49:00 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:25.312 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:25.312 --rc genhtml_branch_coverage=1 00:05:25.312 
--rc genhtml_function_coverage=1 00:05:25.312 --rc genhtml_legend=1 00:05:25.312 --rc geninfo_all_blocks=1 00:05:25.312 --rc geninfo_unexecuted_blocks=1 00:05:25.312 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:25.312 ' 00:05:25.312 04:49:00 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:25.312 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:25.312 --rc genhtml_branch_coverage=1 00:05:25.312 --rc genhtml_function_coverage=1 00:05:25.312 --rc genhtml_legend=1 00:05:25.312 --rc geninfo_all_blocks=1 00:05:25.312 --rc geninfo_unexecuted_blocks=1 00:05:25.312 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:25.312 ' 00:05:25.312 04:49:00 -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:25.312 04:49:00 -- rpc/rpc.sh@65 -- # spdk_pid=3655000 00:05:25.312 04:49:00 -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:25.312 04:49:00 -- rpc/rpc.sh@67 -- # waitforlisten 3655000 00:05:25.312 04:49:00 -- common/autotest_common.sh@829 -- # '[' -z 3655000 ']' 00:05:25.312 04:49:00 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:25.312 04:49:00 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:25.312 04:49:00 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:25.312 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:25.312 04:49:00 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:25.312 04:49:00 -- common/autotest_common.sh@10 -- # set +x 00:05:25.312 [2024-11-08 04:49:00.390217] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:05:25.312 [2024-11-08 04:49:00.390274] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3655000 ] 00:05:25.571 EAL: No free 2048 kB hugepages reported on node 1 00:05:25.571 [2024-11-08 04:49:00.456056] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:25.571 [2024-11-08 04:49:00.525763] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:25.571 [2024-11-08 04:49:00.525865] app.c: 488:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:25.571 [2024-11-08 04:49:00.525876] app.c: 492:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 3655000' to capture a snapshot of events at runtime. 00:05:25.571 [2024-11-08 04:49:00.525885] app.c: 494:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid3655000 for offline analysis/debug. 
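The startup notices above come from spdk_tgt's application framework: the '-e bdev' flag on the command line becomes the tracepoint group mask, and the framework creates the /dev/shm trace file it advertises for offline analysis. A minimal sketch of an application enabling the same group; 'demo_tgt' is a made-up name, and the two-argument opts init is the newer signature, so adjust for older releases:

/* Minimal app-framework sketch matching the spdk_tgt notices above:
 * '-e bdev' corresponds to opts.tpoint_group_mask, and the framework
 * creates the /dev/shm/..._trace.pid<pid> file it tells you to copy. */
#include "spdk/event.h"

static void
app_main(void *ctx)
{
    /* At this point the target is up and the RPC server is listening on
     * /var/tmp/spdk.sock; a real target would now serve RPCs from the
     * reactor loop instead of stopping immediately. */
    spdk_app_stop(0);
}

int
main(int argc, char **argv)
{
    struct spdk_app_opts opts = {};
    int rc;

    spdk_app_opts_init(&opts, sizeof(opts));
    opts.name = "demo_tgt";
    opts.tpoint_group_mask = "bdev";   /* same effect as '-e bdev' */

    rc = spdk_app_start(&opts, app_main, NULL);
    spdk_app_fini();
    return rc;
}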
00:05:25.571 [2024-11-08 04:49:00.525904] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:26.138 04:49:01 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:26.138 04:49:01 -- common/autotest_common.sh@862 -- # return 0 00:05:26.138 04:49:01 -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:26.138 04:49:01 -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc 00:05:26.138 04:49:01 -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:26.138 04:49:01 -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:26.138 04:49:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.138 04:49:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.138 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.138 ************************************ 00:05:26.138 START TEST rpc_integrity 00:05:26.138 ************************************ 00:05:26.138 04:49:01 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:26.138 04:49:01 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:26.138 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.138 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.138 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.138 04:49:01 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:26.138 04:49:01 -- rpc/rpc.sh@13 -- # jq length 00:05:26.397 04:49:01 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:26.397 04:49:01 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:26.397 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.397 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.397 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.397 04:49:01 -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:26.397 04:49:01 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:26.397 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.397 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.397 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.397 04:49:01 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:26.397 { 00:05:26.397 "name": "Malloc0", 00:05:26.397 "aliases": [ 00:05:26.397 "cd4375dc-60fd-4121-ab3d-86ff9f6874af" 00:05:26.397 ], 00:05:26.397 "product_name": "Malloc disk", 00:05:26.397 "block_size": 512, 00:05:26.397 "num_blocks": 16384, 00:05:26.397 "uuid": "cd4375dc-60fd-4121-ab3d-86ff9f6874af", 00:05:26.397 "assigned_rate_limits": { 00:05:26.397 "rw_ios_per_sec": 0, 00:05:26.397 "rw_mbytes_per_sec": 0, 00:05:26.397 "r_mbytes_per_sec": 0, 00:05:26.397 "w_mbytes_per_sec": 0 00:05:26.397 }, 00:05:26.397 "claimed": false, 00:05:26.397 "zoned": false, 00:05:26.397 "supported_io_types": { 00:05:26.397 "read": true, 00:05:26.397 "write": true, 00:05:26.397 "unmap": true, 00:05:26.397 "write_zeroes": true, 00:05:26.397 "flush": true, 00:05:26.397 "reset": true, 00:05:26.397 "compare": false, 00:05:26.397 "compare_and_write": false, 
00:05:26.397 "abort": true, 00:05:26.397 "nvme_admin": false, 00:05:26.397 "nvme_io": false 00:05:26.397 }, 00:05:26.397 "memory_domains": [ 00:05:26.397 { 00:05:26.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:26.397 "dma_device_type": 2 00:05:26.397 } 00:05:26.397 ], 00:05:26.397 "driver_specific": {} 00:05:26.397 } 00:05:26.397 ]' 00:05:26.397 04:49:01 -- rpc/rpc.sh@17 -- # jq length 00:05:26.397 04:49:01 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:26.397 04:49:01 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:26.397 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.397 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.397 [2024-11-08 04:49:01.351779] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:26.397 [2024-11-08 04:49:01.351813] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:26.397 [2024-11-08 04:49:01.351833] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5c7d030 00:05:26.397 [2024-11-08 04:49:01.351845] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:26.397 [2024-11-08 04:49:01.352672] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:26.397 [2024-11-08 04:49:01.352695] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:26.397 Passthru0 00:05:26.397 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.397 04:49:01 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:26.397 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.397 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.397 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.397 04:49:01 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:26.397 { 00:05:26.397 "name": "Malloc0", 00:05:26.397 "aliases": [ 00:05:26.397 "cd4375dc-60fd-4121-ab3d-86ff9f6874af" 00:05:26.397 ], 00:05:26.397 "product_name": "Malloc disk", 00:05:26.397 "block_size": 512, 00:05:26.397 "num_blocks": 16384, 00:05:26.397 "uuid": "cd4375dc-60fd-4121-ab3d-86ff9f6874af", 00:05:26.397 "assigned_rate_limits": { 00:05:26.397 "rw_ios_per_sec": 0, 00:05:26.397 "rw_mbytes_per_sec": 0, 00:05:26.397 "r_mbytes_per_sec": 0, 00:05:26.397 "w_mbytes_per_sec": 0 00:05:26.397 }, 00:05:26.397 "claimed": true, 00:05:26.397 "claim_type": "exclusive_write", 00:05:26.397 "zoned": false, 00:05:26.397 "supported_io_types": { 00:05:26.397 "read": true, 00:05:26.397 "write": true, 00:05:26.397 "unmap": true, 00:05:26.397 "write_zeroes": true, 00:05:26.397 "flush": true, 00:05:26.397 "reset": true, 00:05:26.397 "compare": false, 00:05:26.397 "compare_and_write": false, 00:05:26.397 "abort": true, 00:05:26.397 "nvme_admin": false, 00:05:26.397 "nvme_io": false 00:05:26.397 }, 00:05:26.397 "memory_domains": [ 00:05:26.397 { 00:05:26.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:26.397 "dma_device_type": 2 00:05:26.397 } 00:05:26.397 ], 00:05:26.397 "driver_specific": {} 00:05:26.397 }, 00:05:26.397 { 00:05:26.397 "name": "Passthru0", 00:05:26.397 "aliases": [ 00:05:26.397 "25575e60-0d5b-55fe-87b6-dc7d41dcb94b" 00:05:26.397 ], 00:05:26.397 "product_name": "passthru", 00:05:26.397 "block_size": 512, 00:05:26.397 "num_blocks": 16384, 00:05:26.397 "uuid": "25575e60-0d5b-55fe-87b6-dc7d41dcb94b", 00:05:26.397 "assigned_rate_limits": { 00:05:26.397 "rw_ios_per_sec": 0, 00:05:26.397 "rw_mbytes_per_sec": 0, 00:05:26.397 "r_mbytes_per_sec": 0, 00:05:26.397 
"w_mbytes_per_sec": 0 00:05:26.397 }, 00:05:26.397 "claimed": false, 00:05:26.397 "zoned": false, 00:05:26.397 "supported_io_types": { 00:05:26.397 "read": true, 00:05:26.397 "write": true, 00:05:26.397 "unmap": true, 00:05:26.397 "write_zeroes": true, 00:05:26.397 "flush": true, 00:05:26.397 "reset": true, 00:05:26.397 "compare": false, 00:05:26.397 "compare_and_write": false, 00:05:26.397 "abort": true, 00:05:26.397 "nvme_admin": false, 00:05:26.397 "nvme_io": false 00:05:26.397 }, 00:05:26.397 "memory_domains": [ 00:05:26.397 { 00:05:26.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:26.397 "dma_device_type": 2 00:05:26.397 } 00:05:26.397 ], 00:05:26.397 "driver_specific": { 00:05:26.397 "passthru": { 00:05:26.397 "name": "Passthru0", 00:05:26.397 "base_bdev_name": "Malloc0" 00:05:26.397 } 00:05:26.397 } 00:05:26.397 } 00:05:26.397 ]' 00:05:26.397 04:49:01 -- rpc/rpc.sh@21 -- # jq length 00:05:26.397 04:49:01 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:26.397 04:49:01 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:26.397 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.397 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.397 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.397 04:49:01 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:26.397 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.397 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.397 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.397 04:49:01 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:26.397 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.397 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.397 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.397 04:49:01 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:26.397 04:49:01 -- rpc/rpc.sh@26 -- # jq length 00:05:26.397 04:49:01 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:26.397 00:05:26.397 real 0m0.251s 00:05:26.397 user 0m0.148s 00:05:26.397 sys 0m0.043s 00:05:26.397 04:49:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:26.397 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.397 ************************************ 00:05:26.397 END TEST rpc_integrity 00:05:26.397 ************************************ 00:05:26.656 04:49:01 -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:26.656 04:49:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.656 04:49:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.656 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.656 ************************************ 00:05:26.656 START TEST rpc_plugins 00:05:26.656 ************************************ 00:05:26.656 04:49:01 -- common/autotest_common.sh@1114 -- # rpc_plugins 00:05:26.656 04:49:01 -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:26.656 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.656 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.656 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.656 04:49:01 -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:26.656 04:49:01 -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:26.656 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.656 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.656 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.656 04:49:01 -- 
rpc/rpc.sh@31 -- # bdevs='[ 00:05:26.656 { 00:05:26.656 "name": "Malloc1", 00:05:26.656 "aliases": [ 00:05:26.656 "013cbcd7-c8f9-4995-af92-861788d0ba1a" 00:05:26.656 ], 00:05:26.656 "product_name": "Malloc disk", 00:05:26.656 "block_size": 4096, 00:05:26.656 "num_blocks": 256, 00:05:26.656 "uuid": "013cbcd7-c8f9-4995-af92-861788d0ba1a", 00:05:26.656 "assigned_rate_limits": { 00:05:26.656 "rw_ios_per_sec": 0, 00:05:26.656 "rw_mbytes_per_sec": 0, 00:05:26.656 "r_mbytes_per_sec": 0, 00:05:26.656 "w_mbytes_per_sec": 0 00:05:26.656 }, 00:05:26.656 "claimed": false, 00:05:26.656 "zoned": false, 00:05:26.656 "supported_io_types": { 00:05:26.656 "read": true, 00:05:26.656 "write": true, 00:05:26.656 "unmap": true, 00:05:26.656 "write_zeroes": true, 00:05:26.656 "flush": true, 00:05:26.656 "reset": true, 00:05:26.656 "compare": false, 00:05:26.656 "compare_and_write": false, 00:05:26.656 "abort": true, 00:05:26.656 "nvme_admin": false, 00:05:26.656 "nvme_io": false 00:05:26.656 }, 00:05:26.656 "memory_domains": [ 00:05:26.656 { 00:05:26.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:26.656 "dma_device_type": 2 00:05:26.656 } 00:05:26.656 ], 00:05:26.656 "driver_specific": {} 00:05:26.656 } 00:05:26.656 ]' 00:05:26.656 04:49:01 -- rpc/rpc.sh@32 -- # jq length 00:05:26.656 04:49:01 -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:26.656 04:49:01 -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:26.656 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.656 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.656 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.656 04:49:01 -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:26.656 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.656 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.656 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.656 04:49:01 -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:26.656 04:49:01 -- rpc/rpc.sh@36 -- # jq length 00:05:26.656 04:49:01 -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:26.656 00:05:26.656 real 0m0.138s 00:05:26.656 user 0m0.084s 00:05:26.656 sys 0m0.020s 00:05:26.656 04:49:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:26.656 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.656 ************************************ 00:05:26.656 END TEST rpc_plugins 00:05:26.656 ************************************ 00:05:26.656 04:49:01 -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:26.656 04:49:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.656 04:49:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.656 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.656 ************************************ 00:05:26.656 START TEST rpc_trace_cmd_test 00:05:26.656 ************************************ 00:05:26.656 04:49:01 -- common/autotest_common.sh@1114 -- # rpc_trace_cmd_test 00:05:26.656 04:49:01 -- rpc/rpc.sh@40 -- # local info 00:05:26.656 04:49:01 -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:26.656 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.656 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.656 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.656 04:49:01 -- rpc/rpc.sh@42 -- # info='{ 00:05:26.656 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid3655000", 00:05:26.656 "tpoint_group_mask": "0x8", 00:05:26.656 "iscsi_conn": { 00:05:26.656 "mask": "0x2", 
00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "scsi": { 00:05:26.656 "mask": "0x4", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "bdev": { 00:05:26.656 "mask": "0x8", 00:05:26.656 "tpoint_mask": "0xffffffffffffffff" 00:05:26.656 }, 00:05:26.656 "nvmf_rdma": { 00:05:26.656 "mask": "0x10", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "nvmf_tcp": { 00:05:26.656 "mask": "0x20", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "ftl": { 00:05:26.656 "mask": "0x40", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "blobfs": { 00:05:26.656 "mask": "0x80", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "dsa": { 00:05:26.656 "mask": "0x200", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "thread": { 00:05:26.656 "mask": "0x400", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "nvme_pcie": { 00:05:26.656 "mask": "0x800", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "iaa": { 00:05:26.656 "mask": "0x1000", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "nvme_tcp": { 00:05:26.656 "mask": "0x2000", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 }, 00:05:26.656 "bdev_nvme": { 00:05:26.656 "mask": "0x4000", 00:05:26.656 "tpoint_mask": "0x0" 00:05:26.656 } 00:05:26.656 }' 00:05:26.656 04:49:01 -- rpc/rpc.sh@43 -- # jq length 00:05:26.915 04:49:01 -- rpc/rpc.sh@43 -- # '[' 15 -gt 2 ']' 00:05:26.915 04:49:01 -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:26.915 04:49:01 -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:26.915 04:49:01 -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:26.915 04:49:01 -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:26.915 04:49:01 -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:26.915 04:49:01 -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:26.915 04:49:01 -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:26.915 04:49:01 -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:26.915 00:05:26.915 real 0m0.210s 00:05:26.915 user 0m0.178s 00:05:26.915 sys 0m0.025s 00:05:26.915 04:49:01 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:26.915 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.915 ************************************ 00:05:26.915 END TEST rpc_trace_cmd_test 00:05:26.915 ************************************ 00:05:26.915 04:49:01 -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:26.915 04:49:01 -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:26.915 04:49:01 -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:26.915 04:49:01 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:26.915 04:49:01 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:26.915 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.915 ************************************ 00:05:26.915 START TEST rpc_daemon_integrity 00:05:26.915 ************************************ 00:05:26.915 04:49:01 -- common/autotest_common.sh@1114 -- # rpc_integrity 00:05:26.915 04:49:01 -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:26.915 04:49:01 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.915 04:49:01 -- common/autotest_common.sh@10 -- # set +x 00:05:26.915 04:49:01 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.915 04:49:01 -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:26.915 04:49:01 -- rpc/rpc.sh@13 -- # jq length 00:05:26.915 04:49:02 -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:26.915 04:49:02 -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:26.915 
04:49:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.915 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:26.915 04:49:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:26.915 04:49:02 -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:26.915 04:49:02 -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:26.915 04:49:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:26.915 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.173 04:49:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:27.173 04:49:02 -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:27.173 { 00:05:27.173 "name": "Malloc2", 00:05:27.173 "aliases": [ 00:05:27.173 "9a294c59-9cc9-4b61-8c01-818ae779f55c" 00:05:27.173 ], 00:05:27.173 "product_name": "Malloc disk", 00:05:27.173 "block_size": 512, 00:05:27.173 "num_blocks": 16384, 00:05:27.173 "uuid": "9a294c59-9cc9-4b61-8c01-818ae779f55c", 00:05:27.173 "assigned_rate_limits": { 00:05:27.173 "rw_ios_per_sec": 0, 00:05:27.173 "rw_mbytes_per_sec": 0, 00:05:27.173 "r_mbytes_per_sec": 0, 00:05:27.173 "w_mbytes_per_sec": 0 00:05:27.173 }, 00:05:27.173 "claimed": false, 00:05:27.173 "zoned": false, 00:05:27.173 "supported_io_types": { 00:05:27.173 "read": true, 00:05:27.173 "write": true, 00:05:27.173 "unmap": true, 00:05:27.173 "write_zeroes": true, 00:05:27.173 "flush": true, 00:05:27.173 "reset": true, 00:05:27.173 "compare": false, 00:05:27.173 "compare_and_write": false, 00:05:27.173 "abort": true, 00:05:27.173 "nvme_admin": false, 00:05:27.173 "nvme_io": false 00:05:27.173 }, 00:05:27.173 "memory_domains": [ 00:05:27.173 { 00:05:27.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:27.173 "dma_device_type": 2 00:05:27.173 } 00:05:27.173 ], 00:05:27.173 "driver_specific": {} 00:05:27.173 } 00:05:27.173 ]' 00:05:27.173 04:49:02 -- rpc/rpc.sh@17 -- # jq length 00:05:27.173 04:49:02 -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:27.173 04:49:02 -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:27.173 04:49:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:27.173 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.173 [2024-11-08 04:49:02.085694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:27.173 [2024-11-08 04:49:02.085725] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:27.173 [2024-11-08 04:49:02.085740] vbdev_passthru.c: 676:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x5e06980 00:05:27.173 [2024-11-08 04:49:02.085751] vbdev_passthru.c: 691:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:27.173 [2024-11-08 04:49:02.086447] vbdev_passthru.c: 704:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:27.173 [2024-11-08 04:49:02.086468] vbdev_passthru.c: 705:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:27.173 Passthru0 00:05:27.173 04:49:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:27.173 04:49:02 -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:27.173 04:49:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:27.173 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.173 04:49:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:27.173 04:49:02 -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:27.173 { 00:05:27.173 "name": "Malloc2", 00:05:27.173 "aliases": [ 00:05:27.173 "9a294c59-9cc9-4b61-8c01-818ae779f55c" 00:05:27.173 ], 00:05:27.173 "product_name": "Malloc disk", 00:05:27.173 "block_size": 512, 00:05:27.173 "num_blocks": 16384, 
00:05:27.173 "uuid": "9a294c59-9cc9-4b61-8c01-818ae779f55c", 00:05:27.173 "assigned_rate_limits": { 00:05:27.173 "rw_ios_per_sec": 0, 00:05:27.173 "rw_mbytes_per_sec": 0, 00:05:27.173 "r_mbytes_per_sec": 0, 00:05:27.173 "w_mbytes_per_sec": 0 00:05:27.173 }, 00:05:27.173 "claimed": true, 00:05:27.173 "claim_type": "exclusive_write", 00:05:27.173 "zoned": false, 00:05:27.173 "supported_io_types": { 00:05:27.173 "read": true, 00:05:27.173 "write": true, 00:05:27.173 "unmap": true, 00:05:27.173 "write_zeroes": true, 00:05:27.173 "flush": true, 00:05:27.173 "reset": true, 00:05:27.173 "compare": false, 00:05:27.173 "compare_and_write": false, 00:05:27.173 "abort": true, 00:05:27.173 "nvme_admin": false, 00:05:27.173 "nvme_io": false 00:05:27.173 }, 00:05:27.173 "memory_domains": [ 00:05:27.173 { 00:05:27.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:27.173 "dma_device_type": 2 00:05:27.173 } 00:05:27.173 ], 00:05:27.173 "driver_specific": {} 00:05:27.173 }, 00:05:27.173 { 00:05:27.173 "name": "Passthru0", 00:05:27.173 "aliases": [ 00:05:27.173 "eca5729d-7649-5813-957d-8306ada8b231" 00:05:27.173 ], 00:05:27.173 "product_name": "passthru", 00:05:27.173 "block_size": 512, 00:05:27.173 "num_blocks": 16384, 00:05:27.173 "uuid": "eca5729d-7649-5813-957d-8306ada8b231", 00:05:27.173 "assigned_rate_limits": { 00:05:27.173 "rw_ios_per_sec": 0, 00:05:27.173 "rw_mbytes_per_sec": 0, 00:05:27.173 "r_mbytes_per_sec": 0, 00:05:27.173 "w_mbytes_per_sec": 0 00:05:27.173 }, 00:05:27.173 "claimed": false, 00:05:27.173 "zoned": false, 00:05:27.173 "supported_io_types": { 00:05:27.173 "read": true, 00:05:27.173 "write": true, 00:05:27.173 "unmap": true, 00:05:27.173 "write_zeroes": true, 00:05:27.173 "flush": true, 00:05:27.173 "reset": true, 00:05:27.173 "compare": false, 00:05:27.173 "compare_and_write": false, 00:05:27.173 "abort": true, 00:05:27.173 "nvme_admin": false, 00:05:27.173 "nvme_io": false 00:05:27.173 }, 00:05:27.173 "memory_domains": [ 00:05:27.173 { 00:05:27.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:27.173 "dma_device_type": 2 00:05:27.173 } 00:05:27.173 ], 00:05:27.173 "driver_specific": { 00:05:27.173 "passthru": { 00:05:27.173 "name": "Passthru0", 00:05:27.173 "base_bdev_name": "Malloc2" 00:05:27.173 } 00:05:27.173 } 00:05:27.173 } 00:05:27.173 ]' 00:05:27.173 04:49:02 -- rpc/rpc.sh@21 -- # jq length 00:05:27.173 04:49:02 -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:27.173 04:49:02 -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:27.173 04:49:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:27.173 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.173 04:49:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:27.173 04:49:02 -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:27.173 04:49:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:27.173 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.173 04:49:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:27.173 04:49:02 -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:27.173 04:49:02 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:27.173 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.173 04:49:02 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:27.173 04:49:02 -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:27.173 04:49:02 -- rpc/rpc.sh@26 -- # jq length 00:05:27.173 04:49:02 -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:27.173 00:05:27.173 real 0m0.263s 00:05:27.173 user 0m0.166s 00:05:27.173 sys 0m0.036s 00:05:27.173 
04:49:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:27.173 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.173 ************************************ 00:05:27.173 END TEST rpc_daemon_integrity 00:05:27.173 ************************************ 00:05:27.173 04:49:02 -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:27.173 04:49:02 -- rpc/rpc.sh@84 -- # killprocess 3655000 00:05:27.173 04:49:02 -- common/autotest_common.sh@936 -- # '[' -z 3655000 ']' 00:05:27.173 04:49:02 -- common/autotest_common.sh@940 -- # kill -0 3655000 00:05:27.173 04:49:02 -- common/autotest_common.sh@941 -- # uname 00:05:27.173 04:49:02 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:27.431 04:49:02 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3655000 00:05:27.431 04:49:02 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:27.431 04:49:02 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:27.431 04:49:02 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3655000' 00:05:27.431 killing process with pid 3655000 00:05:27.431 04:49:02 -- common/autotest_common.sh@955 -- # kill 3655000 00:05:27.431 04:49:02 -- common/autotest_common.sh@960 -- # wait 3655000 00:05:27.689 00:05:27.689 real 0m2.429s 00:05:27.689 user 0m3.025s 00:05:27.689 sys 0m0.720s 00:05:27.689 04:49:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:27.689 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.689 ************************************ 00:05:27.689 END TEST rpc 00:05:27.689 ************************************ 00:05:27.689 04:49:02 -- spdk/autotest.sh@164 -- # run_test rpc_client /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:27.689 04:49:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:27.689 04:49:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:27.690 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.690 ************************************ 00:05:27.690 START TEST rpc_client 00:05:27.690 ************************************ 00:05:27.690 04:49:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:27.690 * Looking for test storage... 
00:05:27.690 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client 00:05:27.690 04:49:02 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:27.690 04:49:02 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:27.690 04:49:02 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:27.949 04:49:02 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:27.949 04:49:02 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:27.949 04:49:02 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:27.949 04:49:02 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:27.949 04:49:02 -- scripts/common.sh@335 -- # IFS=.-: 00:05:27.949 04:49:02 -- scripts/common.sh@335 -- # read -ra ver1 00:05:27.949 04:49:02 -- scripts/common.sh@336 -- # IFS=.-: 00:05:27.949 04:49:02 -- scripts/common.sh@336 -- # read -ra ver2 00:05:27.949 04:49:02 -- scripts/common.sh@337 -- # local 'op=<' 00:05:27.949 04:49:02 -- scripts/common.sh@339 -- # ver1_l=2 00:05:27.949 04:49:02 -- scripts/common.sh@340 -- # ver2_l=1 00:05:27.949 04:49:02 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:27.949 04:49:02 -- scripts/common.sh@343 -- # case "$op" in 00:05:27.949 04:49:02 -- scripts/common.sh@344 -- # : 1 00:05:27.949 04:49:02 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:27.949 04:49:02 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:27.949 04:49:02 -- scripts/common.sh@364 -- # decimal 1 00:05:27.949 04:49:02 -- scripts/common.sh@352 -- # local d=1 00:05:27.949 04:49:02 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:27.949 04:49:02 -- scripts/common.sh@354 -- # echo 1 00:05:27.949 04:49:02 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:27.949 04:49:02 -- scripts/common.sh@365 -- # decimal 2 00:05:27.949 04:49:02 -- scripts/common.sh@352 -- # local d=2 00:05:27.949 04:49:02 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:27.949 04:49:02 -- scripts/common.sh@354 -- # echo 2 00:05:27.949 04:49:02 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:27.949 04:49:02 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:27.949 04:49:02 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:27.949 04:49:02 -- scripts/common.sh@367 -- # return 0 00:05:27.949 04:49:02 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:27.949 04:49:02 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:27.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:27.949 --rc genhtml_branch_coverage=1 00:05:27.949 --rc genhtml_function_coverage=1 00:05:27.949 --rc genhtml_legend=1 00:05:27.949 --rc geninfo_all_blocks=1 00:05:27.949 --rc geninfo_unexecuted_blocks=1 00:05:27.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:27.949 ' 00:05:27.949 04:49:02 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:27.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:27.949 --rc genhtml_branch_coverage=1 00:05:27.949 --rc genhtml_function_coverage=1 00:05:27.949 --rc genhtml_legend=1 00:05:27.949 --rc geninfo_all_blocks=1 00:05:27.949 --rc geninfo_unexecuted_blocks=1 00:05:27.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:27.949 ' 00:05:27.949 04:49:02 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:27.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:27.949 --rc genhtml_branch_coverage=1 
00:05:27.949 --rc genhtml_function_coverage=1 00:05:27.949 --rc genhtml_legend=1 00:05:27.949 --rc geninfo_all_blocks=1 00:05:27.949 --rc geninfo_unexecuted_blocks=1 00:05:27.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:27.949 ' 00:05:27.949 04:49:02 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:27.949 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:27.949 --rc genhtml_branch_coverage=1 00:05:27.949 --rc genhtml_function_coverage=1 00:05:27.949 --rc genhtml_legend=1 00:05:27.949 --rc geninfo_all_blocks=1 00:05:27.949 --rc geninfo_unexecuted_blocks=1 00:05:27.949 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:27.949 ' 00:05:27.949 04:49:02 -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:27.949 OK 00:05:27.949 04:49:02 -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:27.949 00:05:27.949 real 0m0.204s 00:05:27.949 user 0m0.119s 00:05:27.949 sys 0m0.097s 00:05:27.949 04:49:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:27.949 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.949 ************************************ 00:05:27.949 END TEST rpc_client 00:05:27.949 ************************************ 00:05:27.949 04:49:02 -- spdk/autotest.sh@165 -- # run_test json_config /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:27.949 04:49:02 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:27.949 04:49:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:27.949 04:49:02 -- common/autotest_common.sh@10 -- # set +x 00:05:27.949 ************************************ 00:05:27.949 START TEST json_config 00:05:27.949 ************************************ 00:05:27.949 04:49:02 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config.sh 00:05:27.949 04:49:03 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:27.949 04:49:03 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:27.949 04:49:03 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:28.207 04:49:03 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:28.207 04:49:03 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:28.207 04:49:03 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:28.207 04:49:03 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:28.207 04:49:03 -- scripts/common.sh@335 -- # IFS=.-: 00:05:28.207 04:49:03 -- scripts/common.sh@335 -- # read -ra ver1 00:05:28.207 04:49:03 -- scripts/common.sh@336 -- # IFS=.-: 00:05:28.207 04:49:03 -- scripts/common.sh@336 -- # read -ra ver2 00:05:28.207 04:49:03 -- scripts/common.sh@337 -- # local 'op=<' 00:05:28.207 04:49:03 -- scripts/common.sh@339 -- # ver1_l=2 00:05:28.207 04:49:03 -- scripts/common.sh@340 -- # ver2_l=1 00:05:28.207 04:49:03 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:28.207 04:49:03 -- scripts/common.sh@343 -- # case "$op" in 00:05:28.207 04:49:03 -- scripts/common.sh@344 -- # : 1 00:05:28.207 04:49:03 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:28.207 04:49:03 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:28.207 04:49:03 -- scripts/common.sh@364 -- # decimal 1 00:05:28.207 04:49:03 -- scripts/common.sh@352 -- # local d=1 00:05:28.207 04:49:03 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:28.207 04:49:03 -- scripts/common.sh@354 -- # echo 1 00:05:28.207 04:49:03 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:28.207 04:49:03 -- scripts/common.sh@365 -- # decimal 2 00:05:28.207 04:49:03 -- scripts/common.sh@352 -- # local d=2 00:05:28.207 04:49:03 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:28.207 04:49:03 -- scripts/common.sh@354 -- # echo 2 00:05:28.207 04:49:03 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:28.207 04:49:03 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:28.207 04:49:03 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:28.207 04:49:03 -- scripts/common.sh@367 -- # return 0 00:05:28.207 04:49:03 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:28.207 04:49:03 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:28.207 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:28.207 --rc genhtml_branch_coverage=1 00:05:28.207 --rc genhtml_function_coverage=1 00:05:28.207 --rc genhtml_legend=1 00:05:28.207 --rc geninfo_all_blocks=1 00:05:28.207 --rc geninfo_unexecuted_blocks=1 00:05:28.207 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:28.207 ' 00:05:28.207 04:49:03 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:28.207 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:28.207 --rc genhtml_branch_coverage=1 00:05:28.207 --rc genhtml_function_coverage=1 00:05:28.207 --rc genhtml_legend=1 00:05:28.207 --rc geninfo_all_blocks=1 00:05:28.207 --rc geninfo_unexecuted_blocks=1 00:05:28.207 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:28.207 ' 00:05:28.207 04:49:03 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:28.207 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:28.207 --rc genhtml_branch_coverage=1 00:05:28.207 --rc genhtml_function_coverage=1 00:05:28.207 --rc genhtml_legend=1 00:05:28.207 --rc geninfo_all_blocks=1 00:05:28.207 --rc geninfo_unexecuted_blocks=1 00:05:28.207 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:28.207 ' 00:05:28.207 04:49:03 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:28.207 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:28.207 --rc genhtml_branch_coverage=1 00:05:28.207 --rc genhtml_function_coverage=1 00:05:28.207 --rc genhtml_legend=1 00:05:28.207 --rc geninfo_all_blocks=1 00:05:28.207 --rc geninfo_unexecuted_blocks=1 00:05:28.207 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:28.207 ' 00:05:28.207 04:49:03 -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:28.207 04:49:03 -- nvmf/common.sh@7 -- # uname -s 00:05:28.207 04:49:03 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:28.207 04:49:03 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:28.207 04:49:03 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:28.207 04:49:03 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:28.207 04:49:03 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:28.207 04:49:03 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:28.207 04:49:03 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:28.207 04:49:03 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:28.207 04:49:03 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:28.207 04:49:03 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:28.207 04:49:03 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:28.207 04:49:03 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:28.207 04:49:03 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:28.207 04:49:03 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:28.207 04:49:03 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:28.207 04:49:03 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:28.207 04:49:03 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:28.207 04:49:03 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:28.207 04:49:03 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:28.207 04:49:03 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.207 04:49:03 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.207 04:49:03 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.207 04:49:03 -- paths/export.sh@5 -- # export PATH 00:05:28.207 04:49:03 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.207 04:49:03 -- nvmf/common.sh@46 -- # : 0 00:05:28.207 04:49:03 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:28.207 04:49:03 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:28.207 04:49:03 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:28.207 04:49:03 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:28.207 04:49:03 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:28.207 04:49:03 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:28.207 04:49:03 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:28.207 
04:49:03 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:28.207 04:49:03 -- json_config/json_config.sh@10 -- # [[ 0 -eq 1 ]] 00:05:28.207 04:49:03 -- json_config/json_config.sh@14 -- # [[ 0 -ne 1 ]] 00:05:28.207 04:49:03 -- json_config/json_config.sh@14 -- # [[ 0 -eq 1 ]] 00:05:28.207 04:49:03 -- json_config/json_config.sh@25 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:28.207 04:49:03 -- json_config/json_config.sh@26 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:28.208 WARNING: No tests are enabled so not running JSON configuration tests 00:05:28.208 04:49:03 -- json_config/json_config.sh@27 -- # exit 0 00:05:28.208 00:05:28.208 real 0m0.156s 00:05:28.208 user 0m0.092s 00:05:28.208 sys 0m0.072s 00:05:28.208 04:49:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:28.208 04:49:03 -- common/autotest_common.sh@10 -- # set +x 00:05:28.208 ************************************ 00:05:28.208 END TEST json_config 00:05:28.208 ************************************ 00:05:28.208 04:49:03 -- spdk/autotest.sh@166 -- # run_test json_config_extra_key /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:28.208 04:49:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:28.208 04:49:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:28.208 04:49:03 -- common/autotest_common.sh@10 -- # set +x 00:05:28.208 ************************************ 00:05:28.208 START TEST json_config_extra_key 00:05:28.208 ************************************ 00:05:28.208 04:49:03 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:05:28.208 04:49:03 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:28.208 04:49:03 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:28.208 04:49:03 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:28.208 04:49:03 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:28.208 04:49:03 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:28.208 04:49:03 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:28.208 04:49:03 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:28.208 04:49:03 -- scripts/common.sh@335 -- # IFS=.-: 00:05:28.208 04:49:03 -- scripts/common.sh@335 -- # read -ra ver1 00:05:28.208 04:49:03 -- scripts/common.sh@336 -- # IFS=.-: 00:05:28.208 04:49:03 -- scripts/common.sh@336 -- # read -ra ver2 00:05:28.208 04:49:03 -- scripts/common.sh@337 -- # local 'op=<' 00:05:28.208 04:49:03 -- scripts/common.sh@339 -- # ver1_l=2 00:05:28.208 04:49:03 -- scripts/common.sh@340 -- # ver2_l=1 00:05:28.208 04:49:03 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:28.208 04:49:03 -- scripts/common.sh@343 -- # case "$op" in 00:05:28.208 04:49:03 -- scripts/common.sh@344 -- # : 1 00:05:28.208 04:49:03 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:28.208 04:49:03 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:28.208 04:49:03 -- scripts/common.sh@364 -- # decimal 1 00:05:28.208 04:49:03 -- scripts/common.sh@352 -- # local d=1 00:05:28.208 04:49:03 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:28.208 04:49:03 -- scripts/common.sh@354 -- # echo 1 00:05:28.208 04:49:03 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:28.208 04:49:03 -- scripts/common.sh@365 -- # decimal 2 00:05:28.208 04:49:03 -- scripts/common.sh@352 -- # local d=2 00:05:28.208 04:49:03 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:28.208 04:49:03 -- scripts/common.sh@354 -- # echo 2 00:05:28.208 04:49:03 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:28.208 04:49:03 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:28.208 04:49:03 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:28.208 04:49:03 -- scripts/common.sh@367 -- # return 0 00:05:28.208 04:49:03 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:28.208 04:49:03 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:28.208 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:28.208 --rc genhtml_branch_coverage=1 00:05:28.208 --rc genhtml_function_coverage=1 00:05:28.208 --rc genhtml_legend=1 00:05:28.208 --rc geninfo_all_blocks=1 00:05:28.208 --rc geninfo_unexecuted_blocks=1 00:05:28.208 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:28.208 ' 00:05:28.208 04:49:03 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:28.208 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:28.208 --rc genhtml_branch_coverage=1 00:05:28.208 --rc genhtml_function_coverage=1 00:05:28.208 --rc genhtml_legend=1 00:05:28.208 --rc geninfo_all_blocks=1 00:05:28.208 --rc geninfo_unexecuted_blocks=1 00:05:28.208 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:28.208 ' 00:05:28.208 04:49:03 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:28.208 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:28.208 --rc genhtml_branch_coverage=1 00:05:28.208 --rc genhtml_function_coverage=1 00:05:28.208 --rc genhtml_legend=1 00:05:28.208 --rc geninfo_all_blocks=1 00:05:28.208 --rc geninfo_unexecuted_blocks=1 00:05:28.208 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:28.208 ' 00:05:28.208 04:49:03 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:28.208 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:28.208 --rc genhtml_branch_coverage=1 00:05:28.208 --rc genhtml_function_coverage=1 00:05:28.208 --rc genhtml_legend=1 00:05:28.208 --rc geninfo_all_blocks=1 00:05:28.208 --rc geninfo_unexecuted_blocks=1 00:05:28.208 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:28.208 ' 00:05:28.208 04:49:03 -- json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/nvmf/common.sh 00:05:28.208 04:49:03 -- nvmf/common.sh@7 -- # uname -s 00:05:28.208 04:49:03 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:28.208 04:49:03 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:28.208 04:49:03 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:28.208 04:49:03 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:28.208 04:49:03 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:28.208 04:49:03 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:28.208 04:49:03 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:28.208 04:49:03 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:28.208 04:49:03 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:28.208 04:49:03 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:28.208 04:49:03 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:809b5fbc-4be7-e711-906e-0017a4403562 00:05:28.208 04:49:03 -- nvmf/common.sh@18 -- # NVME_HOSTID=809b5fbc-4be7-e711-906e-0017a4403562 00:05:28.208 04:49:03 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:28.208 04:49:03 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:28.208 04:49:03 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:28.208 04:49:03 -- nvmf/common.sh@44 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:05:28.466 04:49:03 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:28.466 04:49:03 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:28.466 04:49:03 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:28.466 04:49:03 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.466 04:49:03 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.466 04:49:03 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.466 04:49:03 -- paths/export.sh@5 -- # export PATH 00:05:28.466 04:49:03 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:28.466 04:49:03 -- nvmf/common.sh@46 -- # : 0 00:05:28.466 04:49:03 -- nvmf/common.sh@47 -- # export NVMF_APP_SHM_ID 00:05:28.466 04:49:03 -- nvmf/common.sh@48 -- # build_nvmf_app_args 00:05:28.467 04:49:03 -- nvmf/common.sh@24 -- # '[' 0 -eq 1 ']' 00:05:28.467 04:49:03 -- nvmf/common.sh@28 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:28.467 04:49:03 -- nvmf/common.sh@30 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:28.467 04:49:03 -- nvmf/common.sh@32 -- # '[' -n '' ']' 00:05:28.467 04:49:03 -- nvmf/common.sh@34 -- # '[' 0 -eq 1 ']' 00:05:28.467 
04:49:03 -- nvmf/common.sh@50 -- # have_pci_nics=0 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@16 -- # app_pid=(['target']='') 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@16 -- # declare -A app_pid 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@17 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@17 -- # declare -A app_socket 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@18 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@18 -- # declare -A app_params 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@19 -- # configs_path=(['target']='/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json') 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@19 -- # declare -A configs_path 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@74 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@76 -- # echo 'INFO: launching applications...' 00:05:28.467 INFO: launching applications... 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@77 -- # json_config_test_start_app target --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@24 -- # local app=target 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@25 -- # shift 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@27 -- # [[ -n 22 ]] 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@28 -- # [[ -z '' ]] 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@31 -- # app_pid[$app]=3655799 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@33 -- # echo 'Waiting for target to run...' 00:05:28.467 Waiting for target to run... 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@34 -- # waitforlisten 3655799 /var/tmp/spdk_tgt.sock 00:05:28.467 04:49:03 -- common/autotest_common.sh@829 -- # '[' -z 3655799 ']' 00:05:28.467 04:49:03 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:28.467 04:49:03 -- json_config/json_config_extra_key.sh@30 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/extra_key.json 00:05:28.467 04:49:03 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:28.467 04:49:03 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:28.467 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:28.467 04:49:03 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:28.467 04:49:03 -- common/autotest_common.sh@10 -- # set +x 00:05:28.467 [2024-11-08 04:49:03.351560] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
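For reference, a minimal sketch of the launch-and-wait pattern traced above: start spdk_tgt against a JSON config, then poll its RPC socket until it answers. The retry loop here is an illustrative assumption (the real waitforlisten helper lives in common/autotest_common.sh); the flags and paths mirror this run, paths are relative to the SPDK checkout, and spdk_get_version is one of the methods rpc_get_methods reports later in this log.

  # Launch the target with the extra_key JSON config and remember its pid.
  build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock \
      --json test/json_config/extra_key.json &
  app_pid=$!
  # Poll until the target is up and listening on the UNIX-domain RPC socket.
  for _ in $(seq 1 100); do
      if scripts/rpc.py -s /var/tmp/spdk_tgt.sock spdk_get_version > /dev/null 2>&1; then
          break
      fi
      sleep 0.5
  done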
00:05:28.467 [2024-11-08 04:49:03.351651] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3655799 ] 00:05:28.467 EAL: No free 2048 kB hugepages reported on node 1 00:05:28.725 [2024-11-08 04:49:03.632591] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:28.725 [2024-11-08 04:49:03.695708] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:28.725 [2024-11-08 04:49:03.695794] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:29.291 04:49:04 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:29.291 04:49:04 -- common/autotest_common.sh@862 -- # return 0 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@35 -- # echo '' 00:05:29.291 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@79 -- # echo 'INFO: shutting down applications...' 00:05:29.291 INFO: shutting down applications... 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@80 -- # json_config_test_shutdown_app target 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@40 -- # local app=target 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@43 -- # [[ -n 22 ]] 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@44 -- # [[ -n 3655799 ]] 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@47 -- # kill -SIGINT 3655799 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@49 -- # (( i = 0 )) 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3655799 00:05:29.291 04:49:04 -- json_config/json_config_extra_key.sh@54 -- # sleep 0.5 00:05:29.859 04:49:04 -- json_config/json_config_extra_key.sh@49 -- # (( i++ )) 00:05:29.859 04:49:04 -- json_config/json_config_extra_key.sh@49 -- # (( i < 30 )) 00:05:29.859 04:49:04 -- json_config/json_config_extra_key.sh@50 -- # kill -0 3655799 00:05:29.859 04:49:04 -- json_config/json_config_extra_key.sh@51 -- # app_pid[$app]= 00:05:29.859 04:49:04 -- json_config/json_config_extra_key.sh@52 -- # break 00:05:29.859 04:49:04 -- json_config/json_config_extra_key.sh@57 -- # [[ -n '' ]] 00:05:29.859 04:49:04 -- json_config/json_config_extra_key.sh@62 -- # echo 'SPDK target shutdown done' 00:05:29.860 SPDK target shutdown done 00:05:29.860 04:49:04 -- json_config/json_config_extra_key.sh@82 -- # echo Success 00:05:29.860 Success 00:05:29.860 00:05:29.860 real 0m1.536s 00:05:29.860 user 0m1.266s 00:05:29.860 sys 0m0.424s 00:05:29.860 04:49:04 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:29.860 04:49:04 -- common/autotest_common.sh@10 -- # set +x 00:05:29.860 ************************************ 00:05:29.860 END TEST json_config_extra_key 00:05:29.860 ************************************ 00:05:29.860 04:49:04 -- spdk/autotest.sh@167 -- # run_test alias_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:29.860 04:49:04 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:29.860 04:49:04 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:29.860 04:49:04 -- common/autotest_common.sh@10 -- # set +x 00:05:29.860 ************************************ 00:05:29.860 START TEST alias_rpc 00:05:29.860 ************************************ 00:05:29.860 04:49:04 -- 
common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:29.860 * Looking for test storage... 00:05:29.860 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/alias_rpc 00:05:29.860 04:49:04 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:29.860 04:49:04 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:29.860 04:49:04 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:29.860 04:49:04 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:29.860 04:49:04 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:29.860 04:49:04 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:29.860 04:49:04 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:29.860 04:49:04 -- scripts/common.sh@335 -- # IFS=.-: 00:05:29.860 04:49:04 -- scripts/common.sh@335 -- # read -ra ver1 00:05:29.860 04:49:04 -- scripts/common.sh@336 -- # IFS=.-: 00:05:29.860 04:49:04 -- scripts/common.sh@336 -- # read -ra ver2 00:05:29.860 04:49:04 -- scripts/common.sh@337 -- # local 'op=<' 00:05:29.860 04:49:04 -- scripts/common.sh@339 -- # ver1_l=2 00:05:29.860 04:49:04 -- scripts/common.sh@340 -- # ver2_l=1 00:05:29.860 04:49:04 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:29.860 04:49:04 -- scripts/common.sh@343 -- # case "$op" in 00:05:29.860 04:49:04 -- scripts/common.sh@344 -- # : 1 00:05:29.860 04:49:04 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:29.860 04:49:04 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:29.860 04:49:04 -- scripts/common.sh@364 -- # decimal 1 00:05:29.860 04:49:04 -- scripts/common.sh@352 -- # local d=1 00:05:29.860 04:49:04 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:29.860 04:49:04 -- scripts/common.sh@354 -- # echo 1 00:05:29.860 04:49:04 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:29.860 04:49:04 -- scripts/common.sh@365 -- # decimal 2 00:05:29.860 04:49:04 -- scripts/common.sh@352 -- # local d=2 00:05:29.860 04:49:04 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:29.860 04:49:04 -- scripts/common.sh@354 -- # echo 2 00:05:29.860 04:49:04 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:29.860 04:49:04 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:29.860 04:49:04 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:29.860 04:49:04 -- scripts/common.sh@367 -- # return 0 00:05:29.860 04:49:04 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:29.860 04:49:04 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:29.860 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.860 --rc genhtml_branch_coverage=1 00:05:29.860 --rc genhtml_function_coverage=1 00:05:29.860 --rc genhtml_legend=1 00:05:29.860 --rc geninfo_all_blocks=1 00:05:29.860 --rc geninfo_unexecuted_blocks=1 00:05:29.860 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.860 ' 00:05:29.860 04:49:04 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:29.860 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.860 --rc genhtml_branch_coverage=1 00:05:29.860 --rc genhtml_function_coverage=1 00:05:29.860 --rc genhtml_legend=1 00:05:29.860 --rc geninfo_all_blocks=1 00:05:29.860 --rc geninfo_unexecuted_blocks=1 00:05:29.860 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.860 ' 00:05:29.860 
04:49:04 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:29.860 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.860 --rc genhtml_branch_coverage=1 00:05:29.860 --rc genhtml_function_coverage=1 00:05:29.860 --rc genhtml_legend=1 00:05:29.860 --rc geninfo_all_blocks=1 00:05:29.860 --rc geninfo_unexecuted_blocks=1 00:05:29.860 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.860 ' 00:05:29.860 04:49:04 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:29.860 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:29.860 --rc genhtml_branch_coverage=1 00:05:29.860 --rc genhtml_function_coverage=1 00:05:29.860 --rc genhtml_legend=1 00:05:29.860 --rc geninfo_all_blocks=1 00:05:29.860 --rc geninfo_unexecuted_blocks=1 00:05:29.860 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:29.860 ' 00:05:29.860 04:49:04 -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:29.860 04:49:04 -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=3656117 00:05:29.860 04:49:04 -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 3656117 00:05:29.860 04:49:04 -- common/autotest_common.sh@829 -- # '[' -z 3656117 ']' 00:05:29.860 04:49:04 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:29.860 04:49:04 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:29.860 04:49:04 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:29.860 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:29.860 04:49:04 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:29.860 04:49:04 -- common/autotest_common.sh@10 -- # set +x 00:05:29.860 04:49:04 -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:29.860 [2024-11-08 04:49:04.939688] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
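Below, the alias_rpc test feeds a JSON config back into the running target with load_config. A minimal sketch, assuming conf.json is a placeholder for a config that still uses deprecated method names, and reading -i as load_config's include-aliases switch:

  # Replay a saved config over the default RPC socket (/var/tmp/spdk.sock);
  # -i lets entries written with old aliased method names still resolve.
  scripts/rpc.py load_config -i < conf.json

If an aliased method fails to resolve, the ERR trap installed at alias_rpc.sh@10 kills the target and fails the test.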
00:05:29.860 [2024-11-08 04:49:04.939779] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3656117 ] 00:05:30.118 EAL: No free 2048 kB hugepages reported on node 1 00:05:30.118 [2024-11-08 04:49:05.007420] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:30.118 [2024-11-08 04:49:05.082767] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:30.118 [2024-11-08 04:49:05.082871] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:30.684 04:49:05 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:30.684 04:49:05 -- common/autotest_common.sh@862 -- # return 0 00:05:30.684 04:49:05 -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py load_config -i 00:05:30.942 04:49:05 -- alias_rpc/alias_rpc.sh@19 -- # killprocess 3656117 00:05:30.942 04:49:05 -- common/autotest_common.sh@936 -- # '[' -z 3656117 ']' 00:05:30.942 04:49:05 -- common/autotest_common.sh@940 -- # kill -0 3656117 00:05:30.942 04:49:05 -- common/autotest_common.sh@941 -- # uname 00:05:30.942 04:49:05 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:30.942 04:49:05 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3656117 00:05:30.942 04:49:06 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:30.942 04:49:06 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:30.942 04:49:06 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3656117' 00:05:30.942 killing process with pid 3656117 00:05:30.942 04:49:06 -- common/autotest_common.sh@955 -- # kill 3656117 00:05:30.942 04:49:06 -- common/autotest_common.sh@960 -- # wait 3656117 00:05:31.200 00:05:31.200 real 0m1.568s 00:05:31.200 user 0m1.671s 00:05:31.200 sys 0m0.443s 00:05:31.200 04:49:06 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:31.200 04:49:06 -- common/autotest_common.sh@10 -- # set +x 00:05:31.200 ************************************ 00:05:31.200 END TEST alias_rpc 00:05:31.200 ************************************ 00:05:31.457 04:49:06 -- spdk/autotest.sh@169 -- # [[ 0 -eq 0 ]] 00:05:31.457 04:49:06 -- spdk/autotest.sh@170 -- # run_test spdkcli_tcp /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:31.457 04:49:06 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:31.457 04:49:06 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:31.457 04:49:06 -- common/autotest_common.sh@10 -- # set +x 00:05:31.457 ************************************ 00:05:31.457 START TEST spdkcli_tcp 00:05:31.457 ************************************ 00:05:31.457 04:49:06 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/tcp.sh 00:05:31.457 * Looking for test storage... 
00:05:31.457 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli 00:05:31.457 04:49:06 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:31.457 04:49:06 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:31.457 04:49:06 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:31.457 04:49:06 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:31.457 04:49:06 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:31.457 04:49:06 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:31.457 04:49:06 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:31.457 04:49:06 -- scripts/common.sh@335 -- # IFS=.-: 00:05:31.457 04:49:06 -- scripts/common.sh@335 -- # read -ra ver1 00:05:31.457 04:49:06 -- scripts/common.sh@336 -- # IFS=.-: 00:05:31.457 04:49:06 -- scripts/common.sh@336 -- # read -ra ver2 00:05:31.457 04:49:06 -- scripts/common.sh@337 -- # local 'op=<' 00:05:31.457 04:49:06 -- scripts/common.sh@339 -- # ver1_l=2 00:05:31.457 04:49:06 -- scripts/common.sh@340 -- # ver2_l=1 00:05:31.457 04:49:06 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:31.457 04:49:06 -- scripts/common.sh@343 -- # case "$op" in 00:05:31.457 04:49:06 -- scripts/common.sh@344 -- # : 1 00:05:31.457 04:49:06 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:31.457 04:49:06 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:31.457 04:49:06 -- scripts/common.sh@364 -- # decimal 1 00:05:31.457 04:49:06 -- scripts/common.sh@352 -- # local d=1 00:05:31.457 04:49:06 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:31.457 04:49:06 -- scripts/common.sh@354 -- # echo 1 00:05:31.457 04:49:06 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:31.457 04:49:06 -- scripts/common.sh@365 -- # decimal 2 00:05:31.457 04:49:06 -- scripts/common.sh@352 -- # local d=2 00:05:31.457 04:49:06 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:31.457 04:49:06 -- scripts/common.sh@354 -- # echo 2 00:05:31.457 04:49:06 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:31.457 04:49:06 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:31.457 04:49:06 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:31.457 04:49:06 -- scripts/common.sh@367 -- # return 0 00:05:31.457 04:49:06 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:31.457 04:49:06 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:31.457 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.457 --rc genhtml_branch_coverage=1 00:05:31.457 --rc genhtml_function_coverage=1 00:05:31.457 --rc genhtml_legend=1 00:05:31.457 --rc geninfo_all_blocks=1 00:05:31.457 --rc geninfo_unexecuted_blocks=1 00:05:31.457 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.457 ' 00:05:31.457 04:49:06 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:31.457 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.457 --rc genhtml_branch_coverage=1 00:05:31.457 --rc genhtml_function_coverage=1 00:05:31.457 --rc genhtml_legend=1 00:05:31.457 --rc geninfo_all_blocks=1 00:05:31.457 --rc geninfo_unexecuted_blocks=1 00:05:31.457 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.457 ' 00:05:31.457 04:49:06 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:31.457 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.457 --rc genhtml_branch_coverage=1 
00:05:31.457 --rc genhtml_function_coverage=1 00:05:31.457 --rc genhtml_legend=1 00:05:31.457 --rc geninfo_all_blocks=1 00:05:31.457 --rc geninfo_unexecuted_blocks=1 00:05:31.457 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.457 ' 00:05:31.457 04:49:06 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:31.457 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:31.457 --rc genhtml_branch_coverage=1 00:05:31.457 --rc genhtml_function_coverage=1 00:05:31.457 --rc genhtml_legend=1 00:05:31.457 --rc geninfo_all_blocks=1 00:05:31.457 --rc geninfo_unexecuted_blocks=1 00:05:31.457 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:31.457 ' 00:05:31.457 04:49:06 -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/common.sh 00:05:31.457 04:49:06 -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:05:31.457 04:49:06 -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/json_config/clear_config.py 00:05:31.457 04:49:06 -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:31.457 04:49:06 -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:31.457 04:49:06 -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:31.457 04:49:06 -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:31.457 04:49:06 -- common/autotest_common.sh@722 -- # xtrace_disable 00:05:31.457 04:49:06 -- common/autotest_common.sh@10 -- # set +x 00:05:31.457 04:49:06 -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=3656459 00:05:31.457 04:49:06 -- spdkcli/tcp.sh@27 -- # waitforlisten 3656459 00:05:31.457 04:49:06 -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:31.457 04:49:06 -- common/autotest_common.sh@829 -- # '[' -z 3656459 ']' 00:05:31.457 04:49:06 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:31.457 04:49:06 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:31.457 04:49:06 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:31.457 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:31.457 04:49:06 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:31.457 04:49:06 -- common/autotest_common.sh@10 -- # set +x 00:05:31.457 [2024-11-08 04:49:06.548308] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
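The test pins IP_ADDRESS=127.0.0.1 and PORT=9998 above; below, it bridges that TCP endpoint to the target's UNIX-domain socket so rpc.py can talk TCP. A condensed sketch built from the exact commands recorded in this run:

  # socat forwards each TCP connection on 127.0.0.1:9998 to the RPC socket.
  socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock &
  socat_pid=$!
  # Drive the bridge with rpc.py in TCP mode (-r connection retries,
  # -t timeout in seconds), listing every registered RPC method.
  scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods
  kill $socat_pid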
00:05:31.457 [2024-11-08 04:49:06.548375] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3656459 ] 00:05:31.715 EAL: No free 2048 kB hugepages reported on node 1 00:05:31.715 [2024-11-08 04:49:06.614276] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:31.715 [2024-11-08 04:49:06.682654] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:31.715 [2024-11-08 04:49:06.682825] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:31.715 [2024-11-08 04:49:06.682826] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:32.281 04:49:07 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:32.281 04:49:07 -- common/autotest_common.sh@862 -- # return 0 00:05:32.281 04:49:07 -- spdkcli/tcp.sh@31 -- # socat_pid=3656672 00:05:32.281 04:49:07 -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:32.281 04:49:07 -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:32.539 [ 00:05:32.539 "spdk_get_version", 00:05:32.539 "rpc_get_methods", 00:05:32.539 "trace_get_info", 00:05:32.539 "trace_get_tpoint_group_mask", 00:05:32.539 "trace_disable_tpoint_group", 00:05:32.539 "trace_enable_tpoint_group", 00:05:32.539 "trace_clear_tpoint_mask", 00:05:32.539 "trace_set_tpoint_mask", 00:05:32.539 "vfu_tgt_set_base_path", 00:05:32.539 "framework_get_pci_devices", 00:05:32.539 "framework_get_config", 00:05:32.539 "framework_get_subsystems", 00:05:32.539 "iobuf_get_stats", 00:05:32.539 "iobuf_set_options", 00:05:32.539 "sock_set_default_impl", 00:05:32.539 "sock_impl_set_options", 00:05:32.539 "sock_impl_get_options", 00:05:32.539 "vmd_rescan", 00:05:32.539 "vmd_remove_device", 00:05:32.539 "vmd_enable", 00:05:32.539 "accel_get_stats", 00:05:32.539 "accel_set_options", 00:05:32.539 "accel_set_driver", 00:05:32.539 "accel_crypto_key_destroy", 00:05:32.539 "accel_crypto_keys_get", 00:05:32.539 "accel_crypto_key_create", 00:05:32.539 "accel_assign_opc", 00:05:32.539 "accel_get_module_info", 00:05:32.539 "accel_get_opc_assignments", 00:05:32.539 "notify_get_notifications", 00:05:32.539 "notify_get_types", 00:05:32.539 "bdev_get_histogram", 00:05:32.539 "bdev_enable_histogram", 00:05:32.539 "bdev_set_qos_limit", 00:05:32.539 "bdev_set_qd_sampling_period", 00:05:32.539 "bdev_get_bdevs", 00:05:32.539 "bdev_reset_iostat", 00:05:32.539 "bdev_get_iostat", 00:05:32.539 "bdev_examine", 00:05:32.539 "bdev_wait_for_examine", 00:05:32.539 "bdev_set_options", 00:05:32.539 "scsi_get_devices", 00:05:32.539 "thread_set_cpumask", 00:05:32.539 "framework_get_scheduler", 00:05:32.539 "framework_set_scheduler", 00:05:32.539 "framework_get_reactors", 00:05:32.539 "thread_get_io_channels", 00:05:32.539 "thread_get_pollers", 00:05:32.539 "thread_get_stats", 00:05:32.539 "framework_monitor_context_switch", 00:05:32.539 "spdk_kill_instance", 00:05:32.539 "log_enable_timestamps", 00:05:32.539 "log_get_flags", 00:05:32.539 "log_clear_flag", 00:05:32.539 "log_set_flag", 00:05:32.539 "log_get_level", 00:05:32.539 "log_set_level", 00:05:32.539 "log_get_print_level", 00:05:32.539 "log_set_print_level", 00:05:32.539 "framework_enable_cpumask_locks", 00:05:32.539 "framework_disable_cpumask_locks", 00:05:32.539 "framework_wait_init", 00:05:32.539 
"framework_start_init", 00:05:32.539 "virtio_blk_create_transport", 00:05:32.539 "virtio_blk_get_transports", 00:05:32.539 "vhost_controller_set_coalescing", 00:05:32.539 "vhost_get_controllers", 00:05:32.539 "vhost_delete_controller", 00:05:32.539 "vhost_create_blk_controller", 00:05:32.539 "vhost_scsi_controller_remove_target", 00:05:32.539 "vhost_scsi_controller_add_target", 00:05:32.539 "vhost_start_scsi_controller", 00:05:32.539 "vhost_create_scsi_controller", 00:05:32.539 "ublk_recover_disk", 00:05:32.539 "ublk_get_disks", 00:05:32.539 "ublk_stop_disk", 00:05:32.539 "ublk_start_disk", 00:05:32.539 "ublk_destroy_target", 00:05:32.539 "ublk_create_target", 00:05:32.539 "nbd_get_disks", 00:05:32.539 "nbd_stop_disk", 00:05:32.539 "nbd_start_disk", 00:05:32.539 "env_dpdk_get_mem_stats", 00:05:32.539 "nvmf_subsystem_get_listeners", 00:05:32.539 "nvmf_subsystem_get_qpairs", 00:05:32.539 "nvmf_subsystem_get_controllers", 00:05:32.539 "nvmf_get_stats", 00:05:32.539 "nvmf_get_transports", 00:05:32.539 "nvmf_create_transport", 00:05:32.539 "nvmf_get_targets", 00:05:32.539 "nvmf_delete_target", 00:05:32.539 "nvmf_create_target", 00:05:32.539 "nvmf_subsystem_allow_any_host", 00:05:32.539 "nvmf_subsystem_remove_host", 00:05:32.539 "nvmf_subsystem_add_host", 00:05:32.539 "nvmf_subsystem_remove_ns", 00:05:32.539 "nvmf_subsystem_add_ns", 00:05:32.539 "nvmf_subsystem_listener_set_ana_state", 00:05:32.539 "nvmf_discovery_get_referrals", 00:05:32.539 "nvmf_discovery_remove_referral", 00:05:32.539 "nvmf_discovery_add_referral", 00:05:32.539 "nvmf_subsystem_remove_listener", 00:05:32.539 "nvmf_subsystem_add_listener", 00:05:32.539 "nvmf_delete_subsystem", 00:05:32.539 "nvmf_create_subsystem", 00:05:32.539 "nvmf_get_subsystems", 00:05:32.539 "nvmf_set_crdt", 00:05:32.539 "nvmf_set_config", 00:05:32.539 "nvmf_set_max_subsystems", 00:05:32.539 "iscsi_set_options", 00:05:32.539 "iscsi_get_auth_groups", 00:05:32.539 "iscsi_auth_group_remove_secret", 00:05:32.539 "iscsi_auth_group_add_secret", 00:05:32.539 "iscsi_delete_auth_group", 00:05:32.539 "iscsi_create_auth_group", 00:05:32.539 "iscsi_set_discovery_auth", 00:05:32.539 "iscsi_get_options", 00:05:32.539 "iscsi_target_node_request_logout", 00:05:32.539 "iscsi_target_node_set_redirect", 00:05:32.539 "iscsi_target_node_set_auth", 00:05:32.539 "iscsi_target_node_add_lun", 00:05:32.539 "iscsi_get_connections", 00:05:32.539 "iscsi_portal_group_set_auth", 00:05:32.539 "iscsi_start_portal_group", 00:05:32.539 "iscsi_delete_portal_group", 00:05:32.539 "iscsi_create_portal_group", 00:05:32.539 "iscsi_get_portal_groups", 00:05:32.539 "iscsi_delete_target_node", 00:05:32.539 "iscsi_target_node_remove_pg_ig_maps", 00:05:32.539 "iscsi_target_node_add_pg_ig_maps", 00:05:32.539 "iscsi_create_target_node", 00:05:32.539 "iscsi_get_target_nodes", 00:05:32.539 "iscsi_delete_initiator_group", 00:05:32.539 "iscsi_initiator_group_remove_initiators", 00:05:32.539 "iscsi_initiator_group_add_initiators", 00:05:32.539 "iscsi_create_initiator_group", 00:05:32.539 "iscsi_get_initiator_groups", 00:05:32.539 "vfu_virtio_create_scsi_endpoint", 00:05:32.539 "vfu_virtio_scsi_remove_target", 00:05:32.539 "vfu_virtio_scsi_add_target", 00:05:32.539 "vfu_virtio_create_blk_endpoint", 00:05:32.539 "vfu_virtio_delete_endpoint", 00:05:32.539 "iaa_scan_accel_module", 00:05:32.539 "dsa_scan_accel_module", 00:05:32.539 "ioat_scan_accel_module", 00:05:32.539 "accel_error_inject_error", 00:05:32.540 "bdev_iscsi_delete", 00:05:32.540 "bdev_iscsi_create", 00:05:32.540 "bdev_iscsi_set_options", 
00:05:32.540 "bdev_virtio_attach_controller", 00:05:32.540 "bdev_virtio_scsi_get_devices", 00:05:32.540 "bdev_virtio_detach_controller", 00:05:32.540 "bdev_virtio_blk_set_hotplug", 00:05:32.540 "bdev_ftl_set_property", 00:05:32.540 "bdev_ftl_get_properties", 00:05:32.540 "bdev_ftl_get_stats", 00:05:32.540 "bdev_ftl_unmap", 00:05:32.540 "bdev_ftl_unload", 00:05:32.540 "bdev_ftl_delete", 00:05:32.540 "bdev_ftl_load", 00:05:32.540 "bdev_ftl_create", 00:05:32.540 "bdev_aio_delete", 00:05:32.540 "bdev_aio_rescan", 00:05:32.540 "bdev_aio_create", 00:05:32.540 "blobfs_create", 00:05:32.540 "blobfs_detect", 00:05:32.540 "blobfs_set_cache_size", 00:05:32.540 "bdev_zone_block_delete", 00:05:32.540 "bdev_zone_block_create", 00:05:32.540 "bdev_delay_delete", 00:05:32.540 "bdev_delay_create", 00:05:32.540 "bdev_delay_update_latency", 00:05:32.540 "bdev_split_delete", 00:05:32.540 "bdev_split_create", 00:05:32.540 "bdev_error_inject_error", 00:05:32.540 "bdev_error_delete", 00:05:32.540 "bdev_error_create", 00:05:32.540 "bdev_raid_set_options", 00:05:32.540 "bdev_raid_remove_base_bdev", 00:05:32.540 "bdev_raid_add_base_bdev", 00:05:32.540 "bdev_raid_delete", 00:05:32.540 "bdev_raid_create", 00:05:32.540 "bdev_raid_get_bdevs", 00:05:32.540 "bdev_lvol_grow_lvstore", 00:05:32.540 "bdev_lvol_get_lvols", 00:05:32.540 "bdev_lvol_get_lvstores", 00:05:32.540 "bdev_lvol_delete", 00:05:32.540 "bdev_lvol_set_read_only", 00:05:32.540 "bdev_lvol_resize", 00:05:32.540 "bdev_lvol_decouple_parent", 00:05:32.540 "bdev_lvol_inflate", 00:05:32.540 "bdev_lvol_rename", 00:05:32.540 "bdev_lvol_clone_bdev", 00:05:32.540 "bdev_lvol_clone", 00:05:32.540 "bdev_lvol_snapshot", 00:05:32.540 "bdev_lvol_create", 00:05:32.540 "bdev_lvol_delete_lvstore", 00:05:32.540 "bdev_lvol_rename_lvstore", 00:05:32.540 "bdev_lvol_create_lvstore", 00:05:32.540 "bdev_passthru_delete", 00:05:32.540 "bdev_passthru_create", 00:05:32.540 "bdev_nvme_cuse_unregister", 00:05:32.540 "bdev_nvme_cuse_register", 00:05:32.540 "bdev_opal_new_user", 00:05:32.540 "bdev_opal_set_lock_state", 00:05:32.540 "bdev_opal_delete", 00:05:32.540 "bdev_opal_get_info", 00:05:32.540 "bdev_opal_create", 00:05:32.540 "bdev_nvme_opal_revert", 00:05:32.540 "bdev_nvme_opal_init", 00:05:32.540 "bdev_nvme_send_cmd", 00:05:32.540 "bdev_nvme_get_path_iostat", 00:05:32.540 "bdev_nvme_get_mdns_discovery_info", 00:05:32.540 "bdev_nvme_stop_mdns_discovery", 00:05:32.540 "bdev_nvme_start_mdns_discovery", 00:05:32.540 "bdev_nvme_set_multipath_policy", 00:05:32.540 "bdev_nvme_set_preferred_path", 00:05:32.540 "bdev_nvme_get_io_paths", 00:05:32.540 "bdev_nvme_remove_error_injection", 00:05:32.540 "bdev_nvme_add_error_injection", 00:05:32.540 "bdev_nvme_get_discovery_info", 00:05:32.540 "bdev_nvme_stop_discovery", 00:05:32.540 "bdev_nvme_start_discovery", 00:05:32.540 "bdev_nvme_get_controller_health_info", 00:05:32.540 "bdev_nvme_disable_controller", 00:05:32.540 "bdev_nvme_enable_controller", 00:05:32.540 "bdev_nvme_reset_controller", 00:05:32.540 "bdev_nvme_get_transport_statistics", 00:05:32.540 "bdev_nvme_apply_firmware", 00:05:32.540 "bdev_nvme_detach_controller", 00:05:32.540 "bdev_nvme_get_controllers", 00:05:32.540 "bdev_nvme_attach_controller", 00:05:32.540 "bdev_nvme_set_hotplug", 00:05:32.540 "bdev_nvme_set_options", 00:05:32.540 "bdev_null_resize", 00:05:32.540 "bdev_null_delete", 00:05:32.540 "bdev_null_create", 00:05:32.540 "bdev_malloc_delete", 00:05:32.540 "bdev_malloc_create" 00:05:32.540 ] 00:05:32.540 04:49:07 -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 
00:05:32.540 04:49:07 -- common/autotest_common.sh@728 -- # xtrace_disable 00:05:32.540 04:49:07 -- common/autotest_common.sh@10 -- # set +x 00:05:32.540 04:49:07 -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:32.540 04:49:07 -- spdkcli/tcp.sh@38 -- # killprocess 3656459 00:05:32.540 04:49:07 -- common/autotest_common.sh@936 -- # '[' -z 3656459 ']' 00:05:32.540 04:49:07 -- common/autotest_common.sh@940 -- # kill -0 3656459 00:05:32.540 04:49:07 -- common/autotest_common.sh@941 -- # uname 00:05:32.540 04:49:07 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:05:32.540 04:49:07 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3656459 00:05:32.798 04:49:07 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:05:32.798 04:49:07 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:05:32.798 04:49:07 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3656459' 00:05:32.798 killing process with pid 3656459 00:05:32.798 04:49:07 -- common/autotest_common.sh@955 -- # kill 3656459 00:05:32.798 04:49:07 -- common/autotest_common.sh@960 -- # wait 3656459 00:05:33.057 00:05:33.057 real 0m1.625s 00:05:33.057 user 0m2.977s 00:05:33.057 sys 0m0.511s 00:05:33.057 04:49:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:05:33.057 04:49:07 -- common/autotest_common.sh@10 -- # set +x 00:05:33.057 ************************************ 00:05:33.057 END TEST spdkcli_tcp 00:05:33.057 ************************************ 00:05:33.057 04:49:08 -- spdk/autotest.sh@173 -- # run_test dpdk_mem_utility /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:33.057 04:49:08 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:05:33.057 04:49:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:05:33.057 04:49:08 -- common/autotest_common.sh@10 -- # set +x 00:05:33.057 ************************************ 00:05:33.057 START TEST dpdk_mem_utility 00:05:33.057 ************************************ 00:05:33.057 04:49:08 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:33.057 * Looking for test storage... 
00:05:33.057 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/dpdk_memory_utility 00:05:33.057 04:49:08 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:05:33.057 04:49:08 -- common/autotest_common.sh@1690 -- # lcov --version 00:05:33.057 04:49:08 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:05:33.316 04:49:08 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:05:33.316 04:49:08 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:05:33.316 04:49:08 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:05:33.316 04:49:08 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:05:33.316 04:49:08 -- scripts/common.sh@335 -- # IFS=.-: 00:05:33.316 04:49:08 -- scripts/common.sh@335 -- # read -ra ver1 00:05:33.316 04:49:08 -- scripts/common.sh@336 -- # IFS=.-: 00:05:33.316 04:49:08 -- scripts/common.sh@336 -- # read -ra ver2 00:05:33.316 04:49:08 -- scripts/common.sh@337 -- # local 'op=<' 00:05:33.316 04:49:08 -- scripts/common.sh@339 -- # ver1_l=2 00:05:33.316 04:49:08 -- scripts/common.sh@340 -- # ver2_l=1 00:05:33.316 04:49:08 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:05:33.316 04:49:08 -- scripts/common.sh@343 -- # case "$op" in 00:05:33.316 04:49:08 -- scripts/common.sh@344 -- # : 1 00:05:33.316 04:49:08 -- scripts/common.sh@363 -- # (( v = 0 )) 00:05:33.316 04:49:08 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:33.316 04:49:08 -- scripts/common.sh@364 -- # decimal 1 00:05:33.316 04:49:08 -- scripts/common.sh@352 -- # local d=1 00:05:33.316 04:49:08 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:33.316 04:49:08 -- scripts/common.sh@354 -- # echo 1 00:05:33.316 04:49:08 -- scripts/common.sh@364 -- # ver1[v]=1 00:05:33.316 04:49:08 -- scripts/common.sh@365 -- # decimal 2 00:05:33.316 04:49:08 -- scripts/common.sh@352 -- # local d=2 00:05:33.316 04:49:08 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:33.316 04:49:08 -- scripts/common.sh@354 -- # echo 2 00:05:33.316 04:49:08 -- scripts/common.sh@365 -- # ver2[v]=2 00:05:33.316 04:49:08 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:05:33.316 04:49:08 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:05:33.316 04:49:08 -- scripts/common.sh@367 -- # return 0 00:05:33.316 04:49:08 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:33.316 04:49:08 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:05:33.316 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.316 --rc genhtml_branch_coverage=1 00:05:33.316 --rc genhtml_function_coverage=1 00:05:33.316 --rc genhtml_legend=1 00:05:33.316 --rc geninfo_all_blocks=1 00:05:33.316 --rc geninfo_unexecuted_blocks=1 00:05:33.316 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.316 ' 00:05:33.316 04:49:08 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:05:33.316 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.316 --rc genhtml_branch_coverage=1 00:05:33.316 --rc genhtml_function_coverage=1 00:05:33.316 --rc genhtml_legend=1 00:05:33.316 --rc geninfo_all_blocks=1 00:05:33.316 --rc geninfo_unexecuted_blocks=1 00:05:33.316 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.316 ' 00:05:33.316 04:49:08 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:05:33.316 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.316 --rc 
genhtml_branch_coverage=1 00:05:33.316 --rc genhtml_function_coverage=1 00:05:33.316 --rc genhtml_legend=1 00:05:33.316 --rc geninfo_all_blocks=1 00:05:33.316 --rc geninfo_unexecuted_blocks=1 00:05:33.316 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.316 ' 00:05:33.316 04:49:08 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:05:33.316 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:33.316 --rc genhtml_branch_coverage=1 00:05:33.316 --rc genhtml_function_coverage=1 00:05:33.316 --rc genhtml_legend=1 00:05:33.316 --rc geninfo_all_blocks=1 00:05:33.316 --rc geninfo_unexecuted_blocks=1 00:05:33.316 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:05:33.316 ' 00:05:33.316 04:49:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:33.316 04:49:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=3656804 00:05:33.316 04:49:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 3656804 00:05:33.316 04:49:08 -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt 00:05:33.316 04:49:08 -- common/autotest_common.sh@829 -- # '[' -z 3656804 ']' 00:05:33.316 04:49:08 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:33.316 04:49:08 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:33.316 04:49:08 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:33.316 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:33.316 04:49:08 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:33.316 04:49:08 -- common/autotest_common.sh@10 -- # set +x 00:05:33.316 [2024-11-08 04:49:08.224011] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
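The memory inspection below boils down to one RPC plus a helper script. A sketch whose paths match the RPC response and commands recorded in this run:

  # Ask the running target to dump its DPDK memory state; the RPC replies
  # with the dump location, /tmp/spdk_mem_dump.txt.
  scripts/rpc.py env_dpdk_get_mem_stats
  # Summarize heaps and mempools from the dump, then print the detailed
  # free-element/malloc/memzone layout of heap 0.
  scripts/dpdk_mem_info.py
  scripts/dpdk_mem_info.py -m 0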
00:05:33.316 [2024-11-08 04:49:08.224105] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3656804 ] 00:05:33.316 EAL: No free 2048 kB hugepages reported on node 1 00:05:33.316 [2024-11-08 04:49:08.291552] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.316 [2024-11-08 04:49:08.365984] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:05:33.316 [2024-11-08 04:49:08.366090] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.252 04:49:09 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:34.252 04:49:09 -- common/autotest_common.sh@862 -- # return 0 00:05:34.252 04:49:09 -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:05:34.252 04:49:09 -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:05:34.252 04:49:09 -- common/autotest_common.sh@561 -- # xtrace_disable 00:05:34.252 04:49:09 -- common/autotest_common.sh@10 -- # set +x 00:05:34.252 { 00:05:34.252 "filename": "/tmp/spdk_mem_dump.txt" 00:05:34.252 } 00:05:34.252 04:49:09 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:05:34.252 04:49:09 -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:05:34.252 DPDK memory size 814.000000 MiB in 1 heap(s) 00:05:34.252 1 heaps totaling size 814.000000 MiB 00:05:34.252 size: 814.000000 MiB heap id: 0 00:05:34.252 end heaps---------- 00:05:34.252 8 mempools totaling size 598.116089 MiB 00:05:34.252 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:05:34.252 size: 158.602051 MiB name: PDU_data_out_Pool 00:05:34.252 size: 84.521057 MiB name: bdev_io_3656804 00:05:34.252 size: 51.011292 MiB name: evtpool_3656804 00:05:34.252 size: 50.003479 MiB name: msgpool_3656804 00:05:34.252 size: 21.763794 MiB name: PDU_Pool 00:05:34.252 size: 19.513306 MiB name: SCSI_TASK_Pool 00:05:34.252 size: 0.026123 MiB name: Session_Pool 00:05:34.252 end mempools------- 00:05:34.252 6 memzones totaling size 4.142822 MiB 00:05:34.252 size: 1.000366 MiB name: RG_ring_0_3656804 00:05:34.252 size: 1.000366 MiB name: RG_ring_1_3656804 00:05:34.252 size: 1.000366 MiB name: RG_ring_4_3656804 00:05:34.252 size: 1.000366 MiB name: RG_ring_5_3656804 00:05:34.252 size: 0.125366 MiB name: RG_ring_2_3656804 00:05:34.252 size: 0.015991 MiB name: RG_ring_3_3656804 00:05:34.252 end memzones------- 00:05:34.252 04:49:09 -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:05:34.252 heap id: 0 total size: 814.000000 MiB number of busy elements: 41 number of free elements: 15 00:05:34.252 list of free elements. 
size: 12.519348 MiB
00:05:34.252 element at address: 0x200000400000 with size: 1.999512 MiB
00:05:34.252 element at address: 0x200018e00000 with size: 0.999878 MiB
00:05:34.252 element at address: 0x200019000000 with size: 0.999878 MiB
00:05:34.252 element at address: 0x200003e00000 with size: 0.996277 MiB
00:05:34.252 element at address: 0x200031c00000 with size: 0.994446 MiB
00:05:34.252 element at address: 0x200013800000 with size: 0.978699 MiB
00:05:34.252 element at address: 0x200007000000 with size: 0.959839 MiB
00:05:34.252 element at address: 0x200019200000 with size: 0.936584 MiB
00:05:34.252 element at address: 0x200000200000 with size: 0.841614 MiB
00:05:34.252 element at address: 0x20001aa00000 with size: 0.582886 MiB
00:05:34.252 element at address: 0x20000b200000 with size: 0.490723 MiB
00:05:34.252 element at address: 0x200000800000 with size: 0.487793 MiB
00:05:34.252 element at address: 0x200019400000 with size: 0.485657 MiB
00:05:34.252 element at address: 0x200027e00000 with size: 0.410034 MiB
00:05:34.252 element at address: 0x200003a00000 with size: 0.355530 MiB
list of standard malloc elements. size: 199.218079 MiB
00:05:34.253 element at address: 0x20000b3fff80 with size: 132.000122 MiB
00:05:34.253 element at address: 0x2000071fff80 with size: 64.000122 MiB
00:05:34.253 element at address: 0x200018efff80 with size: 1.000122 MiB
00:05:34.253 element at address: 0x2000190fff80 with size: 1.000122 MiB
00:05:34.253 element at address: 0x2000192fff80 with size: 1.000122 MiB
00:05:34.253 element at address: 0x2000003d9f00 with size: 0.140747 MiB
00:05:34.253 element at address: 0x2000192eff00 with size: 0.062622 MiB
00:05:34.253 element at address: 0x2000003fdf80 with size: 0.007935 MiB
00:05:34.253 element at address: 0x2000192efdc0 with size: 0.000305 MiB
00:05:34.253 element at address: 0x2000002d7740 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000002d7800 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000002d78c0 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000002d7ac0 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000002d7b80 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000002d7c40 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000003d9e40 with size: 0.000183 MiB
00:05:34.253 element at address: 0x20000087ce00 with size: 0.000183 MiB
00:05:34.253 element at address: 0x20000087cec0 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000008fd180 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200003a5b040 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200003adb300 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200003adb500 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200003adf7c0 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200003affa80 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200003affb40 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200003eff0c0 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000070fdd80 with size: 0.000183 MiB
00:05:34.253 element at address: 0x20000b27da00 with size: 0.000183 MiB
00:05:34.253 element at address: 0x20000b27dac0 with size: 0.000183 MiB
00:05:34.253 element at address: 0x20000b2fdd80 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000138fa8c0 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000192efc40 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000192efd00 with size: 0.000183 MiB
00:05:34.253 element at address: 0x2000194bc740 with size: 0.000183 MiB
00:05:34.253 element at address: 0x20001aa95380 with size: 0.000183 MiB
00:05:34.253 element at address: 0x20001aa95440 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200027e68f80 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200027e69040 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200027e6fc40 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200027e6fe40 with size: 0.000183 MiB
00:05:34.253 element at address: 0x200027e6ff00 with size: 0.000183 MiB
list of memzone associated elements. size: 602.262573 MiB
00:05:34.253 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:05:34.253 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:05:34.253 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:05:34.253 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:05:34.253 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:05:34.253 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_3656804_0
00:05:34.253 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:05:34.253 associated memzone info: size: 48.002930 MiB name: MP_evtpool_3656804_0
00:05:34.253 element at address: 0x200003fff380 with size: 48.003052 MiB
00:05:34.253 associated memzone info: size: 48.002930 MiB name: MP_msgpool_3656804_0
00:05:34.253 element at address: 0x2000195be940 with size: 20.255554 MiB
00:05:34.253 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:05:34.253 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:05:34.253 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:05:34.253 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:05:34.253 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_3656804
00:05:34.253 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:05:34.253 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_3656804
00:05:34.253 element at address: 0x2000002d7d00 with size: 1.008118 MiB
00:05:34.253 associated memzone info: size: 1.007996 MiB name: MP_evtpool_3656804
00:05:34.253 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:05:34.253 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:05:34.253 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:05:34.253 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:05:34.253 element at address: 0x2000070fde40 with size: 1.008118 MiB
00:05:34.253 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:05:34.253 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:05:34.253 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:05:34.253 element at address: 0x200003eff180 with size: 1.000488 MiB
00:05:34.253 associated memzone info: size: 1.000366 MiB name: RG_ring_0_3656804
00:05:34.253 element at address: 0x200003affc00 with size: 1.000488 MiB
00:05:34.253 associated memzone info: size: 1.000366 MiB name: RG_ring_1_3656804
00:05:34.253 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:05:34.253 associated memzone info: size: 1.000366 MiB name: RG_ring_4_3656804
00:05:34.253 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:05:34.253 associated memzone info: size: 1.000366 MiB name: RG_ring_5_3656804
00:05:34.253 element at address: 0x200003a5b100 with size: 0.500488 MiB
00:05:34.253 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_3656804
00:05:34.253 element at address: 0x20000b27db80 with size: 0.500488 MiB
00:05:34.253 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:05:34.253 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:05:34.253 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:05:34.253 element at address: 0x20001947c540 with size: 0.250488 MiB
00:05:34.253 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:05:34.253 element at address: 0x200003adf880 with size: 0.125488 MiB
00:05:34.253 associated memzone info: size: 0.125366 MiB name: RG_ring_2_3656804
00:05:34.253 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:05:34.253 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:05:34.253 element at address: 0x200027e69100 with size: 0.023743 MiB
00:05:34.253 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:05:34.253 element at address: 0x200003adb5c0 with size: 0.016113 MiB
00:05:34.253 associated memzone info: size: 0.015991 MiB name: RG_ring_3_3656804
00:05:34.253 element at address: 0x200027e6f240 with size: 0.002441 MiB
00:05:34.253 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:05:34.253 element at address: 0x2000002d7980 with size: 0.000305 MiB
00:05:34.253 associated memzone info: size: 0.000183 MiB name: MP_msgpool_3656804
00:05:34.253 element at address: 0x200003adb3c0 with size: 0.000305 MiB
00:05:34.253 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_3656804
00:05:34.253 element at address: 0x200027e6fd00 with size: 0.000305 MiB
00:05:34.253 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:05:34.253 04:49:09 -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:05:34.253 04:49:09 -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 3656804
00:05:34.253 04:49:09 -- common/autotest_common.sh@936 -- # '[' -z 3656804 ']'
00:05:34.253 04:49:09 -- common/autotest_common.sh@940 -- # kill -0 3656804
00:05:34.253 04:49:09 -- common/autotest_common.sh@941 -- # uname
00:05:34.253 04:49:09 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:34.253 04:49:09 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3656804
00:05:34.253 04:49:09 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:05:34.253 04:49:09 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:05:34.253 04:49:09 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3656804'
killing process with pid 3656804
04:49:09 -- common/autotest_common.sh@955 -- # kill 3656804
00:05:34.253 04:49:09 -- common/autotest_common.sh@960 -- # wait 3656804
00:05:34.512
00:05:34.512 real 0m1.475s
00:05:34.512 user 0m1.521s
00:05:34.512 sys 0m0.440s
00:05:34.512 04:49:09 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:34.512 04:49:09 -- common/autotest_common.sh@10 -- # set +x
00:05:34.512 ************************************
00:05:34.512 END TEST dpdk_mem_utility
00:05:34.512 ************************************
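The dump above is what test_dpdk_mem_info.sh captures before tearing the target down: every DPDK heap element plus the memzones backing the app's pools, with the MP_*/RG_* names keyed to the target's pid (3656804). A minimal sketch of the same capture-and-teardown flow, assuming a target is already up on the default RPC socket; the dump file path is the RPC's usual default and may differ on other builds:

  # Ask the running target to dump its DPDK memory stats, then tear it down
  # the way killprocess does: signal, then poll with kill -0 until it exits.
  pid=3656804                        # illustrative; use the real target pid
  ./spdk/scripts/rpc.py env_dpdk_get_mem_stats
  cat /tmp/spdk_mem_dump.txt         # assumed default output file of the RPC
  kill "$pid"
  while kill -0 "$pid" 2>/dev/null; do sleep 0.2; done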
00:05:34.512 04:49:09 -- spdk/autotest.sh@174 -- # run_test event /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh
00:05:34.512 04:49:09 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:34.512 04:49:09 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:34.512 04:49:09 -- common/autotest_common.sh@10 -- # set +x
00:05:34.512 ************************************
00:05:34.512 START TEST event
00:05:34.512 ************************************
00:05:34.512 04:49:09 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event.sh
00:05:34.770 * Looking for test storage...
00:05:34.770 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event
00:05:34.770 04:49:09 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:34.770 04:49:09 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:34.770 04:49:09 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:34.770 04:49:09 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:34.770 04:49:09 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:34.770 04:49:09 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:34.770 04:49:09 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:34.770 04:49:09 -- scripts/common.sh@335 -- # IFS=.-:
00:05:34.770 04:49:09 -- scripts/common.sh@335 -- # read -ra ver1
00:05:34.770 04:49:09 -- scripts/common.sh@336 -- # IFS=.-:
00:05:34.770 04:49:09 -- scripts/common.sh@336 -- # read -ra ver2
00:05:34.770 04:49:09 -- scripts/common.sh@337 -- # local 'op=<'
00:05:34.770 04:49:09 -- scripts/common.sh@339 -- # ver1_l=2
00:05:34.770 04:49:09 -- scripts/common.sh@340 -- # ver2_l=1
00:05:34.770 04:49:09 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:34.771 04:49:09 -- scripts/common.sh@343 -- # case "$op" in
00:05:34.771 04:49:09 -- scripts/common.sh@344 -- # : 1
00:05:34.771 04:49:09 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:34.771 04:49:09 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:34.771 04:49:09 -- scripts/common.sh@364 -- # decimal 1
00:05:34.771 04:49:09 -- scripts/common.sh@352 -- # local d=1
00:05:34.771 04:49:09 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:34.771 04:49:09 -- scripts/common.sh@354 -- # echo 1
00:05:34.771 04:49:09 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:34.771 04:49:09 -- scripts/common.sh@365 -- # decimal 2
00:05:34.771 04:49:09 -- scripts/common.sh@352 -- # local d=2
00:05:34.771 04:49:09 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:34.771 04:49:09 -- scripts/common.sh@354 -- # echo 2
00:05:34.771 04:49:09 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:34.771 04:49:09 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:34.771 04:49:09 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:34.771 04:49:09 -- scripts/common.sh@367 -- # return 0
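The scripts/common.sh trace above is the harness deciding whether the installed lcov predates 2.0: cmp_versions splits both version strings on '.', '-' and ':' and compares the fields numerically (here 1 < 2, so it returns 0 and the extra --rc coverage flags get exported). A rough standalone equivalent of that loop, assuming plain bash with no SPDK helpers:

  # Returns 0 when dotted version $1 sorts before $2 (numeric, field by field).
  version_lt() {
      local IFS=.-: v
      local -a a b
      read -ra a <<< "$1"
      read -ra b <<< "$2"
      for (( v = 0; v < (${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]}); v++ )); do
          (( ${a[v]:-0} < ${b[v]:-0} )) && return 0
          (( ${a[v]:-0} > ${b[v]:-0} )) && return 1
      done
      return 1   # equal is not "less than"
  }
  version_lt 1.15 2 && echo "lcov older than 2: enable the extra --rc flags"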
00:05:34.771 04:49:09 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:34.771 04:49:09 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:34.771 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:34.771 --rc genhtml_branch_coverage=1
00:05:34.771 --rc genhtml_function_coverage=1
00:05:34.771 --rc genhtml_legend=1
00:05:34.771 --rc geninfo_all_blocks=1
00:05:34.771 --rc geninfo_unexecuted_blocks=1
00:05:34.771 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:34.771 '
00:05:34.771 04:49:09 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:34.771 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:34.771 --rc genhtml_branch_coverage=1
00:05:34.771 --rc genhtml_function_coverage=1
00:05:34.771 --rc genhtml_legend=1
00:05:34.771 --rc geninfo_all_blocks=1
00:05:34.771 --rc geninfo_unexecuted_blocks=1
00:05:34.771 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:34.771 '
00:05:34.771 04:49:09 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:34.771 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:34.771 --rc genhtml_branch_coverage=1
00:05:34.771 --rc genhtml_function_coverage=1
00:05:34.771 --rc genhtml_legend=1
00:05:34.771 --rc geninfo_all_blocks=1
00:05:34.771 --rc geninfo_unexecuted_blocks=1
00:05:34.771 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:34.771 '
00:05:34.771 04:49:09 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:34.771 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:34.771 --rc genhtml_branch_coverage=1
00:05:34.771 --rc genhtml_function_coverage=1
00:05:34.771 --rc genhtml_legend=1
00:05:34.771 --rc geninfo_all_blocks=1
00:05:34.771 --rc geninfo_unexecuted_blocks=1
00:05:34.771 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:34.771 '
00:05:34.771 04:49:09 -- event/event.sh@9 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/bdev/nbd_common.sh
00:05:34.771 04:49:09 -- bdev/nbd_common.sh@6 -- # set -e
00:05:34.771 04:49:09 -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:05:34.771 04:49:09 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']'
00:05:34.771 04:49:09 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:34.771 04:49:09 -- common/autotest_common.sh@10 -- # set +x
00:05:34.771 ************************************
00:05:34.771 START TEST event_perf
00:05:34.771 ************************************
00:05:34.771 04:49:09 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:05:34.771 Running I/O for 1 seconds...[2024-11-08 04:49:09.759992] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:34.771 [2024-11-08 04:49:09.760083] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3657136 ]
00:05:34.771 EAL: No free 2048 kB hugepages reported on node 1
00:05:34.771 [2024-11-08 04:49:09.829510] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:05:35.029 [2024-11-08 04:49:09.902042] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:35.029 [2024-11-08 04:49:09.902138] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:05:35.029 [2024-11-08 04:49:09.902211] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:05:35.029 [2024-11-08 04:49:09.902213] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:35.965 Running I/O for 1 seconds...
00:05:35.965 lcore 0: 191841
00:05:35.965 lcore 1: 191839
00:05:35.965 lcore 2: 191837
00:05:35.965 lcore 3: 191838
00:05:35.965 done.
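event_perf above launches one reactor per core in the 0xF mask (cores 0 through 3) and counts how many events each lcore turns over in the one-second window; the roughly equal per-lcore tallies are the signal that event dispatch is spreading load evenly. The invocation pattern, with the binary path as built in this workspace:

  # -m 0xF: core mask for four reactors; -t 1: measure for one second
  ./spdk/test/event/event_perf/event_perf -m 0xF -t 1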
00:05:35.965
00:05:35.965 real 0m1.228s
00:05:35.965 user 0m4.129s
00:05:35.965 sys 0m0.095s
00:05:35.965 04:49:10 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:35.965 04:49:10 -- common/autotest_common.sh@10 -- # set +x
00:05:35.965 ************************************
00:05:35.965 END TEST event_perf
00:05:35.965 ************************************
00:05:35.965 04:49:11 -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:05:35.965 04:49:11 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']'
00:05:35.965 04:49:11 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:35.965 04:49:11 -- common/autotest_common.sh@10 -- # set +x
00:05:35.965 ************************************
00:05:35.965 START TEST event_reactor
00:05:35.965 ************************************
00:05:35.965 04:49:11 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:05:35.965 [2024-11-08 04:49:11.037606] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:35.965 [2024-11-08 04:49:11.037704] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3657421 ]
00:05:35.965 EAL: No free 2048 kB hugepages reported on node 1
00:05:36.224 [2024-11-08 04:49:11.107695] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:36.224 [2024-11-08 04:49:11.173397] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:37.160 test_start
00:05:37.160 oneshot
00:05:37.160 tick 100
00:05:37.160 tick 100
00:05:37.160 tick 250
00:05:37.160 tick 100
00:05:37.160 tick 100
00:05:37.160 tick 100
00:05:37.160 tick 250
00:05:37.160 tick 500
00:05:37.160 tick 100
00:05:37.160 tick 100
00:05:37.160 tick 250
00:05:37.160 tick 100
00:05:37.160 tick 100
00:05:37.160 test_end
00:05:37.160
00:05:37.160 real 0m1.216s
00:05:37.160 user 0m1.125s
00:05:37.160 sys 0m0.086s
00:05:37.160 04:49:12 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:37.160 04:49:12 -- common/autotest_common.sh@10 -- # set +x
00:05:37.160 ************************************
00:05:37.160 END TEST event_reactor
00:05:37.160 ************************************
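The oneshot/tick lines in the reactor test are its registered timers firing on the single reactor: one one-shot event plus repeating ticks, each printed with the period it was registered at. The START/END banners and the real/user/sys triple around every subtest come from the harness's run_test wrapper; a simplified sketch of that pattern (the real helper in autotest_common.sh also manages xtrace state and return codes):

  run_test() {
      local name=$1; shift
      echo "************ START TEST $name ************"
      time "$@"                      # run the test binary or shell function
      echo "************ END TEST $name ************"
  }
  run_test event_reactor ./spdk/test/event/reactor/reactor -t 1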
00:05:37.418 04:49:12 -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:05:37.418 04:49:12 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']'
00:05:37.418 04:49:12 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:37.418 04:49:12 -- common/autotest_common.sh@10 -- # set +x
00:05:37.418 ************************************
00:05:37.418 START TEST event_reactor_perf
00:05:37.418 ************************************
00:05:37.418 04:49:12 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:05:37.418 [2024-11-08 04:49:12.304042] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:37.418 [2024-11-08 04:49:12.304132] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3657711 ]
00:05:37.418 EAL: No free 2048 kB hugepages reported on node 1
00:05:37.418 [2024-11-08 04:49:12.374932] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:37.418 [2024-11-08 04:49:12.442430] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:38.431 test_start
00:05:38.431 test_end
00:05:38.431 Performance: 975352 events per second
00:05:38.431
00:05:38.431 real 0m1.219s
00:05:38.431 user 0m1.123s
00:05:38.431 sys 0m0.091s
00:05:38.431 04:49:13 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:38.431 04:49:13 -- common/autotest_common.sh@10 -- # set +x
00:05:38.431 ************************************
00:05:38.431 END TEST event_reactor_perf
00:05:38.431 ************************************
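reactor_perf is the single-core variant: one reactor (-c 0x1 in the EAL arguments) dispatching events back to itself as fast as it can, here 975352 events per second. If that figure is needed programmatically, scraping the Performance line is enough (illustrative):

  rate=$(./spdk/test/event/reactor_perf/reactor_perf -t 1 | awk '/Performance:/ {print $2}')
  echo "single-reactor event throughput: $rate events/sec"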
00:05:38.690 04:49:13 -- event/event.sh@49 -- # uname -s
00:05:38.690 04:49:13 -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:05:38.690 04:49:13 -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:05:38.690 04:49:13 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:38.690 04:49:13 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:38.690 04:49:13 -- common/autotest_common.sh@10 -- # set +x
00:05:38.690 ************************************
00:05:38.690 START TEST event_scheduler
00:05:38.690 ************************************
00:05:38.690 04:49:13 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:05:38.690 * Looking for test storage...
00:05:38.690 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler
00:05:38.690 04:49:13 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:05:38.690 04:49:13 -- common/autotest_common.sh@1690 -- # lcov --version
00:05:38.690 04:49:13 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:05:38.690 04:49:13 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:05:38.690 04:49:13 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:05:38.690 04:49:13 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:05:38.690 04:49:13 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:05:38.690 04:49:13 -- scripts/common.sh@335 -- # IFS=.-:
00:05:38.690 04:49:13 -- scripts/common.sh@335 -- # read -ra ver1
00:05:38.690 04:49:13 -- scripts/common.sh@336 -- # IFS=.-:
00:05:38.690 04:49:13 -- scripts/common.sh@336 -- # read -ra ver2
00:05:38.690 04:49:13 -- scripts/common.sh@337 -- # local 'op=<'
00:05:38.690 04:49:13 -- scripts/common.sh@339 -- # ver1_l=2
00:05:38.690 04:49:13 -- scripts/common.sh@340 -- # ver2_l=1
00:05:38.690 04:49:13 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:05:38.690 04:49:13 -- scripts/common.sh@343 -- # case "$op" in
00:05:38.690 04:49:13 -- scripts/common.sh@344 -- # : 1
00:05:38.690 04:49:13 -- scripts/common.sh@363 -- # (( v = 0 ))
00:05:38.690 04:49:13 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:05:38.690 04:49:13 -- scripts/common.sh@364 -- # decimal 1
00:05:38.690 04:49:13 -- scripts/common.sh@352 -- # local d=1
00:05:38.690 04:49:13 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:05:38.690 04:49:13 -- scripts/common.sh@354 -- # echo 1
00:05:38.690 04:49:13 -- scripts/common.sh@364 -- # ver1[v]=1
00:05:38.690 04:49:13 -- scripts/common.sh@365 -- # decimal 2
00:05:38.690 04:49:13 -- scripts/common.sh@352 -- # local d=2
00:05:38.690 04:49:13 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:05:38.690 04:49:13 -- scripts/common.sh@354 -- # echo 2
00:05:38.690 04:49:13 -- scripts/common.sh@365 -- # ver2[v]=2
00:05:38.690 04:49:13 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:05:38.690 04:49:13 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:05:38.690 04:49:13 -- scripts/common.sh@367 -- # return 0
00:05:38.690 04:49:13 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:05:38.691 04:49:13 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:05:38.691 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:38.691 --rc genhtml_branch_coverage=1
00:05:38.691 --rc genhtml_function_coverage=1
00:05:38.691 --rc genhtml_legend=1
00:05:38.691 --rc geninfo_all_blocks=1
00:05:38.691 --rc geninfo_unexecuted_blocks=1
00:05:38.691 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:38.691 '
00:05:38.691 04:49:13 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:05:38.691 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:38.691 --rc genhtml_branch_coverage=1
00:05:38.691 --rc genhtml_function_coverage=1
00:05:38.691 --rc genhtml_legend=1
00:05:38.691 --rc geninfo_all_blocks=1
00:05:38.691 --rc geninfo_unexecuted_blocks=1
00:05:38.691 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:38.691 '
00:05:38.691 04:49:13 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:05:38.691 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:38.691 --rc genhtml_branch_coverage=1
00:05:38.691 --rc genhtml_function_coverage=1
00:05:38.691 --rc genhtml_legend=1
00:05:38.691 --rc geninfo_all_blocks=1
00:05:38.691 --rc geninfo_unexecuted_blocks=1
00:05:38.691 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:38.691 '
00:05:38.691 04:49:13 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:05:38.691 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:05:38.691 --rc genhtml_branch_coverage=1
00:05:38.691 --rc genhtml_function_coverage=1
00:05:38.691 --rc genhtml_legend=1
00:05:38.691 --rc geninfo_all_blocks=1
00:05:38.691 --rc geninfo_unexecuted_blocks=1
00:05:38.691 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:05:38.691 '
00:05:38.691 04:49:13 -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:05:38.691 04:49:13 -- scheduler/scheduler.sh@35 -- # scheduler_pid=3658035
00:05:38.691 04:49:13 -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:05:38.691 04:49:13 -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:05:38.691 04:49:13 -- scheduler/scheduler.sh@37 -- # waitforlisten 3658035
00:05:38.691 04:49:13 -- common/autotest_common.sh@829 -- # '[' -z 3658035 ']'
00:05:38.691 04:49:13 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:05:38.691 04:49:13 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:38.691 04:49:13 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:05:38.691 04:49:13 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:38.691 04:49:13 -- common/autotest_common.sh@10 -- # set +x
00:05:38.691 [2024-11-08 04:49:13.767312] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:38.691 [2024-11-08 04:49:13.767407] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3658035 ]
00:05:38.950 EAL: No free 2048 kB hugepages reported on node 1
00:05:38.950 [2024-11-08 04:49:13.832346] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4
00:05:38.950 [2024-11-08 04:49:13.903811] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:38.950 [2024-11-08 04:49:13.903831] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:38.950 [2024-11-08 04:49:13.903916] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3
00:05:38.950 [2024-11-08 04:49:13.903918] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2
00:05:39.516 04:49:14 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:39.516 04:49:14 -- common/autotest_common.sh@862 -- # return 0
00:05:39.516 04:49:14 -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:05:39.516 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.516 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.516 POWER: Env isn't set yet!
00:05:39.516 POWER: Attempting to initialise ACPI cpufreq power management...
00:05:39.516 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor
00:05:39.516 POWER: Cannot set governor of lcore 0 to userspace
00:05:39.516 POWER: Attempting to initialise PSTAT power management...
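The POWER lines show DPDK's power library probing for a usable cpufreq driver: the ACPI path fails because the governor file cannot be written as configured, then the pstate path is tried. The sysfs files involved are the ones named in the error, so a quick look at what the library will find can be done like this (paths as in the log; the core list is illustrative):

  for c in 0 1 2 3; do
      cat /sys/devices/system/cpu/cpu$c/cpufreq/scaling_governor
  done
  cat /sys/devices/system/cpu/cpu0/cpufreq/scaling_available_governors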
00:05:39.516 POWER: Power management governor of lcore 0 has been set to 'performance' successfully
00:05:39.516 POWER: Initialized successfully for lcore 0 power management
00:05:39.516 POWER: Power management governor of lcore 1 has been set to 'performance' successfully
00:05:39.516 POWER: Initialized successfully for lcore 1 power management
00:05:39.775 POWER: Power management governor of lcore 2 has been set to 'performance' successfully
00:05:39.775 POWER: Initialized successfully for lcore 2 power management
00:05:39.775 POWER: Power management governor of lcore 3 has been set to 'performance' successfully
00:05:39.775 POWER: Initialized successfully for lcore 3 power management
00:05:39.775 [2024-11-08 04:49:14.644709] scheduler_dynamic.c: 387:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:05:39.775 [2024-11-08 04:49:14.644726] scheduler_dynamic.c: 389:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:05:39.775 [2024-11-08 04:49:14.644736] scheduler_dynamic.c: 391:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 [2024-11-08 04:49:14.712667] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
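With every lcore governor switched to 'performance', the test flips the app onto the dynamic scheduler over RPC before letting framework initialization finish; the set_opts notices (load limit 20, core limit 80, core busy 95) are the dynamic scheduler's rebalancing thresholds. The RPC sequence mirrors what scheduler.sh's rpc_cmd wrapper does; the app was started with --wait-for-rpc, which is why framework_start_init is needed:

  ./spdk/scripts/rpc.py framework_set_scheduler dynamic
  ./spdk/scripts/rpc.py framework_start_init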
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:05:39.775 04:49:14 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:39.775 04:49:14 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 ************************************
00:05:39.775 START TEST scheduler_create_thread
00:05:39.775 ************************************
00:05:39.775 04:49:14 -- common/autotest_common.sh@1114 -- # scheduler_create_thread
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 2
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 3
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 4
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 5
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 6
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 7
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 8
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 9
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 10
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:39.775 04:49:14 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@22 -- # thread_id=11
00:05:39.775 04:49:14 -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50
00:05:39.775 04:49:14 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:39.775 04:49:14 -- common/autotest_common.sh@10 -- # set +x
00:05:40.710 04:49:15 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:40.710 04:49:15 -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100
00:05:40.710 04:49:15 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:40.710 04:49:15 -- common/autotest_common.sh@10 -- # set +x
00:05:42.085 04:49:17 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:42.085 04:49:17 -- scheduler/scheduler.sh@25 -- # thread_id=12
00:05:42.085 04:49:17 -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12
00:05:42.085 04:49:17 -- common/autotest_common.sh@561 -- # xtrace_disable
00:05:42.085 04:49:17 -- common/autotest_common.sh@10 -- # set +x
00:05:43.019 04:49:18 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:05:43.019
00:05:43.019 real 0m3.382s
00:05:43.019 user 0m0.023s
00:05:43.019 sys 0m0.008s
00:05:43.019 04:49:18 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:43.019 04:49:18 -- common/autotest_common.sh@10 -- # set +x
00:05:43.019 ************************************
00:05:43.019 END TEST scheduler_create_thread
00:05:43.019 ************************************
00:05:43.277 04:49:18 -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT
00:05:43.277 04:49:18 -- scheduler/scheduler.sh@46 -- # killprocess 3658035
00:05:43.277 04:49:18 -- common/autotest_common.sh@936 -- # '[' -z 3658035 ']'
00:05:43.277 04:49:18 -- common/autotest_common.sh@940 -- # kill -0 3658035
00:05:43.277 04:49:18 -- common/autotest_common.sh@941 -- # uname
00:05:43.277 04:49:18 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:05:43.277 04:49:18 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3658035
00:05:43.277 04:49:18 -- common/autotest_common.sh@942 -- # process_name=reactor_2
00:05:43.277 04:49:18 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']'
00:05:43.277 04:49:18 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3658035'
killing process with pid 3658035
00:05:43.277 04:49:18 -- common/autotest_common.sh@955 -- # kill 3658035
00:05:43.277 04:49:18 -- common/autotest_common.sh@960 -- # wait 3658035
00:05:43.535 [2024-11-08 04:49:18.484465] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped.
00:05:43.535 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully
00:05:43.535 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original
00:05:43.535 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully
00:05:43.535 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original
00:05:43.535 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully
00:05:43.535 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original
00:05:43.535 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully
00:05:43.535 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original
00:05:43.794
00:05:43.794 real 0m5.147s
00:05:43.794 user 0m10.572s
00:05:43.794 sys 0m0.424s
00:05:43.794 04:49:18 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:05:43.794 04:49:18 -- common/autotest_common.sh@10 -- # set +x
00:05:43.794 ************************************
00:05:43.794 END TEST event_scheduler
00:05:43.794 ************************************
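The scheduler_create_thread subtest that just ended drove the scheduler app's plugin RPCs: ten scheduler_thread_create calls with varying cpumasks and active percentages (the bare numbers 2 through 10 are the returned thread ids), one thread dialed from idle to 50% active, and one created and then deleted. A condensed sketch of that RPC flow, assuming the same plugin is importable and the app is on the default socket:

  rpc() { ./spdk/scripts/rpc.py --plugin scheduler_plugin "$@"; }
  id=$(rpc scheduler_thread_create -n active_pinned -m 0x1 -a 100)  # pinned to core 0, reports 100% busy
  rpc scheduler_thread_set_active "$id" 50                          # now report 50% busy
  rpc scheduler_thread_delete "$id"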
00:05:43.794 04:49:18 -- event/event.sh@51 -- # modprobe -n nbd
00:05:43.794 04:49:18 -- event/event.sh@52 -- # run_test app_repeat app_repeat_test
00:05:43.794 04:49:18 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:05:43.794 04:49:18 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:05:43.794 04:49:18 -- common/autotest_common.sh@10 -- # set +x
00:05:43.794 ************************************
00:05:43.794 START TEST app_repeat
00:05:43.794 ************************************
00:05:43.794 04:49:18 -- common/autotest_common.sh@1114 -- # app_repeat_test
00:05:43.794 04:49:18 -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:43.794 04:49:18 -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:43.794 04:49:18 -- event/event.sh@13 -- # local nbd_list
00:05:43.794 04:49:18 -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:43.794 04:49:18 -- event/event.sh@14 -- # local bdev_list
00:05:43.794 04:49:18 -- event/event.sh@15 -- # local repeat_times=4
00:05:43.794 04:49:18 -- event/event.sh@17 -- # modprobe nbd
00:05:43.794 04:49:18 -- event/event.sh@19 -- # repeat_pid=3658902
00:05:43.794 04:49:18 -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT
00:05:43.794 04:49:18 -- event/event.sh@18 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4
00:05:43.794 04:49:18 -- event/event.sh@21 -- # echo 'Process app_repeat pid: 3658902'
Process app_repeat pid: 3658902
00:05:43.794 04:49:18 -- event/event.sh@23 -- # for i in {0..2}
00:05:43.794 04:49:18 -- event/event.sh@24 -- # echo 'spdk_app_start Round 0'
spdk_app_start Round 0
00:05:43.794 04:49:18 -- event/event.sh@25 -- # waitforlisten 3658902 /var/tmp/spdk-nbd.sock
00:05:43.794 04:49:18 -- common/autotest_common.sh@829 -- # '[' -z 3658902 ']'
00:05:43.794 04:49:18 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:05:43.794 04:49:18 -- common/autotest_common.sh@834 -- # local max_retries=100
00:05:43.794 04:49:18 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:05:43.794 04:49:18 -- common/autotest_common.sh@838 -- # xtrace_disable
00:05:43.794 04:49:18 -- common/autotest_common.sh@10 -- # set +x
00:05:43.794 [2024-11-08 04:49:18.793578] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:05:43.794 [2024-11-08 04:49:18.793659] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3658902 ]
00:05:43.794 EAL: No free 2048 kB hugepages reported on node 1
00:05:43.794 [2024-11-08 04:49:18.863727] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:44.052 [2024-11-08 04:49:18.939197] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:44.052 [2024-11-08 04:49:18.939199] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:44.624 04:49:19 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:05:44.624 04:49:19 -- common/autotest_common.sh@862 -- # return 0
00:05:44.624 04:49:19 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:44.885 Malloc0
00:05:44.885 04:49:19 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:05:45.143 Malloc1
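Round 0 begins by building the data path: two RAM-backed bdevs (bdev_malloc_create 64 4096 is 64 MB with a 4 KiB block size; the RPC prints the assigned names Malloc0 and Malloc1), which the nbd helpers then map to kernel block devices. The two core RPCs, against the app_repeat socket used above:

  rpc="./spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  name=$($rpc bdev_malloc_create 64 4096)   # prints the new bdev name
  $rpc nbd_start_disk "$name" /dev/nbd0     # expose it as a kernel block device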
00:05:45.143 04:49:20 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@91 -- # local bdev_list
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@92 -- # local nbd_list
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
04:49:20 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@12 -- # local i
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:05:45.143 /dev/nbd0
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:05:45.143 04:49:20 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:05:45.143 04:49:20 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:05:45.143 04:49:20 -- common/autotest_common.sh@867 -- # local i
00:05:45.143 04:49:20 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:45.143 04:49:20 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:45.143 04:49:20 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:05:45.143 04:49:20 -- common/autotest_common.sh@871 -- # break
00:05:45.143 04:49:20 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:45.143 04:49:20 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:45.143 04:49:20 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:45.143 1+0 records in
00:05:45.143 1+0 records out
00:05:45.143 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229942 s, 17.8 MB/s
00:05:45.143 04:49:20 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:45.400 04:49:20 -- common/autotest_common.sh@884 -- # size=4096
00:05:45.400 04:49:20 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:45.400 04:49:20 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:45.400 04:49:20 -- common/autotest_common.sh@887 -- # return 0
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:05:45.400 /dev/nbd1
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:05:45.400 04:49:20 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:05:45.400 04:49:20 -- common/autotest_common.sh@867 -- # local i
00:05:45.400 04:49:20 -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:05:45.400 04:49:20 -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:05:45.400 04:49:20 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:05:45.400 04:49:20 -- common/autotest_common.sh@871 -- # break
00:05:45.400 04:49:20 -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:05:45.400 04:49:20 -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:05:45.400 04:49:20 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:05:45.400 1+0 records in
00:05:45.400 1+0 records out
00:05:45.400 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265604 s, 15.4 MB/s
00:05:45.400 04:49:20 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:45.400 04:49:20 -- common/autotest_common.sh@884 -- # size=4096
00:05:45.400 04:49:20 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest
00:05:45.400 04:49:20 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:05:45.400 04:49:20 -- common/autotest_common.sh@887 -- # return 0
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:45.400 04:49:20 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:05:45.657 {
00:05:45.657 "nbd_device": "/dev/nbd0",
00:05:45.657 "bdev_name": "Malloc0"
00:05:45.657 },
00:05:45.657 {
00:05:45.657 "nbd_device": "/dev/nbd1",
00:05:45.657 "bdev_name": "Malloc1"
00:05:45.657 }
00:05:45.657 ]'
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@64 -- # echo '[
00:05:45.657 {
00:05:45.657 "nbd_device": "/dev/nbd0",
00:05:45.657 "bdev_name": "Malloc0"
00:05:45.657 },
00:05:45.657 {
00:05:45.657 "nbd_device": "/dev/nbd1",
00:05:45.657 "bdev_name": "Malloc1"
00:05:45.657 }
00:05:45.657 ]'
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:05:45.657 /dev/nbd1'
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:05:45.657 /dev/nbd1'
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@65 -- # count=2
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@66 -- # echo 2
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@95 -- # count=2
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@71 -- # local operation=write
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:05:45.657 256+0 records in
00:05:45.657 256+0 records out
00:05:45.657 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108257 s, 96.9 MB/s
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:05:45.657 256+0 records in
00:05:45.657 256+0 records out
00:05:45.657 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195581 s, 53.6 MB/s
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:05:45.657 04:49:20 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:05:45.657 256+0 records in
00:05:45.657 256+0 records out
00:05:45.658 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0209757 s, 50.0 MB/s
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@70 -- # local nbd_list
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@71 -- # local operation=verify
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:45.658 04:49:20 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest
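The nbdrandtest sequence above is the actual data integrity check: 1 MiB of urandom is written through each nbd device with O_DIRECT, then cmp reads it back against the source file, so a mismatch anywhere in the first 1M fails the round. Stripped of the helper plumbing it reduces to this (temp path illustrative):

  dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256
  for dev in /dev/nbd0 /dev/nbd1; do
      dd if=/tmp/nbdrandtest of=$dev bs=4096 count=256 oflag=direct
      cmp -b -n 1M /tmp/nbdrandtest $dev   # exits non-zero on any mismatch
  done
  rm /tmp/nbdrandtest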
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@50 -- # local nbd_list
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@51 -- # local i
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@41 -- # break
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@45 -- # return 0
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:05:45.916 04:49:20 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@41 -- # break
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@45 -- # return 0
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:05:46.175 04:49:21 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@64 -- # echo '[]'
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@65 -- # echo ''
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@65 -- # true
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@65 -- # count=0
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@66 -- # echo 0
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@104 -- # count=0
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:05:46.434 04:49:21 -- bdev/nbd_common.sh@109 -- # return 0
00:05:46.434 04:49:21 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:05:46.692 04:49:21 -- event/event.sh@35 -- # sleep 3
00:05:46.692 [2024-11-08 04:49:21.774361] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2
00:05:46.951 [2024-11-08 04:49:21.837859] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1
00:05:46.951 [2024-11-08 04:49:21.837861] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:05:46.951 [2024-11-08 04:49:21.878558] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:05:46.951 [2024-11-08 04:49:21.878597] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
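That closes round 0: the nbd devices are unmapped, nbd_get_disks confirms an empty list, and the harness asks the app to cycle with spdk_kill_instance SIGTERM before sleeping into the next round. app_repeat handles the signal by restarting its subsystems rather than exiting, which is consistent with the same pid 3658902 serving round 1 below. The teardown RPCs, in order:

  rpc="./spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  $rpc nbd_stop_disk /dev/nbd0
  $rpc nbd_stop_disk /dev/nbd1
  $rpc spdk_kill_instance SIGTERM   # app_repeat cycles into the next round
  sleep 3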
00:05:50.234 04:49:24 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:50.234 04:49:24 -- common/autotest_common.sh@10 -- # set +x 00:05:50.234 04:49:24 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:50.234 04:49:24 -- common/autotest_common.sh@862 -- # return 0 00:05:50.234 04:49:24 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:50.234 Malloc0 00:05:50.234 04:49:24 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:50.234 Malloc1 00:05:50.234 04:49:25 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@12 -- # local i 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:50.234 /dev/nbd0 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:50.234 04:49:25 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:50.234 04:49:25 -- common/autotest_common.sh@867 -- # local i 00:05:50.234 04:49:25 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:50.234 04:49:25 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:50.234 04:49:25 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:50.234 04:49:25 -- common/autotest_common.sh@871 -- # break 00:05:50.234 04:49:25 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:50.234 04:49:25 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:50.234 04:49:25 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:50.234 1+0 records in 00:05:50.234 1+0 records out 00:05:50.234 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021669 s, 18.9 MB/s 00:05:50.234 04:49:25 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:50.234 04:49:25 -- common/autotest_common.sh@884 -- # size=4096 00:05:50.234 04:49:25 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:50.234 04:49:25 -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:50.234 04:49:25 -- common/autotest_common.sh@887 -- # return 0 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:50.234 04:49:25 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:50.493 /dev/nbd1 00:05:50.493 04:49:25 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:50.493 04:49:25 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:50.493 04:49:25 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:50.493 04:49:25 -- common/autotest_common.sh@867 -- # local i 00:05:50.493 04:49:25 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:50.493 04:49:25 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:50.493 04:49:25 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:50.493 04:49:25 -- common/autotest_common.sh@871 -- # break 00:05:50.493 04:49:25 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:50.493 04:49:25 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:50.493 04:49:25 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:50.493 1+0 records in 00:05:50.493 1+0 records out 00:05:50.493 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000396948 s, 10.3 MB/s 00:05:50.493 04:49:25 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:50.493 04:49:25 -- common/autotest_common.sh@884 -- # size=4096 00:05:50.493 04:49:25 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:50.493 04:49:25 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:50.493 04:49:25 -- common/autotest_common.sh@887 -- # return 0 00:05:50.493 04:49:25 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:50.493 04:49:25 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:50.493 04:49:25 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:50.493 04:49:25 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:50.493 04:49:25 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:50.751 { 00:05:50.751 "nbd_device": "/dev/nbd0", 00:05:50.751 "bdev_name": "Malloc0" 00:05:50.751 }, 00:05:50.751 { 00:05:50.751 "nbd_device": "/dev/nbd1", 00:05:50.751 "bdev_name": "Malloc1" 00:05:50.751 } 00:05:50.751 ]' 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:50.751 { 00:05:50.751 "nbd_device": "/dev/nbd0", 00:05:50.751 "bdev_name": "Malloc0" 00:05:50.751 }, 00:05:50.751 { 00:05:50.751 "nbd_device": "/dev/nbd1", 00:05:50.751 "bdev_name": "Malloc1" 00:05:50.751 } 00:05:50.751 ]' 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:50.751 /dev/nbd1' 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:50.751 /dev/nbd1' 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@65 -- # count=2 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:50.751 04:49:25 -- 
bdev/nbd_common.sh@95 -- # count=2 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:50.751 256+0 records in 00:05:50.751 256+0 records out 00:05:50.751 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00449673 s, 233 MB/s 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:50.751 256+0 records in 00:05:50.751 256+0 records out 00:05:50.751 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0194129 s, 54.0 MB/s 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:50.751 256+0 records in 00:05:50.751 256+0 records out 00:05:50.751 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0207363 s, 50.6 MB/s 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:50.751 04:49:25 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@51 -- # local i 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:50.752 04:49:25 -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@41 -- # break 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@45 -- # return 0 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:51.010 04:49:26 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@41 -- # break 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@45 -- # return 0 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:51.268 04:49:26 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:51.526 04:49:26 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@65 -- # true 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@65 -- # count=0 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@104 -- # count=0 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:51.527 04:49:26 -- bdev/nbd_common.sh@109 -- # return 0 00:05:51.527 04:49:26 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:51.785 04:49:26 -- event/event.sh@35 -- # sleep 3 00:05:51.785 [2024-11-08 04:49:26.854320] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:52.044 [2024-11-08 04:49:26.918426] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:52.044 [2024-11-08 04:49:26.918427] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.044 [2024-11-08 04:49:26.959781] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:52.044 [2024-11-08 04:49:26.959822] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 
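Annotation: the round that just completed is the same nbd round trip each app_repeat iteration performs: rpc.py bdev_malloc_create builds two 64 MB malloc bdevs (size in MB, then a 4096-byte block size), nbd_start_disk exposes them as /dev/nbd0 and /dev/nbd1, 1 MiB of /dev/urandom is pushed through each kernel device with oflag=direct, cmp -b -n 1M confirms the bytes read back intact, and nbd_stop_disk unwinds the mapping. The gatekeeper for each mapping is waitfornbd; a rough reconstruction from the xtrace above follows — the retry sleep is an assumption (the trace passes on its first probe) and the temp-file path is illustrative, not the path used by the suite.

    # Sketch of waitfornbd as reconstructed from the xtrace; not the verbatim source.
    waitfornbd() {
        local nbd_name=$1 i size
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1    # assumed back-off; the trace never needs a retry
        done
        # Prove the device actually answers I/O: read one 4 KiB block with O_DIRECT.
        dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct || return 1
        size=$(stat -c %s /tmp/nbdtest)
        rm -f /tmp/nbdtest
        [ "$size" != 0 ]    # the trace evaluates '[' 4096 '!=' 0 ']' and returns 0
    }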
00:05:54.573 04:49:29 -- event/event.sh@23 -- # for i in {0..2} 00:05:54.573 04:49:29 -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:54.573 spdk_app_start Round 2 00:05:54.573 04:49:29 -- event/event.sh@25 -- # waitforlisten 3658902 /var/tmp/spdk-nbd.sock 00:05:54.573 04:49:29 -- common/autotest_common.sh@829 -- # '[' -z 3658902 ']' 00:05:54.573 04:49:29 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:54.573 04:49:29 -- common/autotest_common.sh@834 -- # local max_retries=100 00:05:54.573 04:49:29 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:54.573 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:54.573 04:49:29 -- common/autotest_common.sh@838 -- # xtrace_disable 00:05:54.573 04:49:29 -- common/autotest_common.sh@10 -- # set +x 00:05:54.830 04:49:29 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:05:54.830 04:49:29 -- common/autotest_common.sh@862 -- # return 0 00:05:54.830 04:49:29 -- event/event.sh@27 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:55.088 Malloc0 00:05:55.088 04:49:30 -- event/event.sh@28 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:55.346 Malloc1 00:05:55.346 04:49:30 -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@12 -- # local i 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:55.346 /dev/nbd0 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:55.346 04:49:30 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:55.346 04:49:30 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:05:55.346 04:49:30 -- common/autotest_common.sh@867 -- # local i 00:05:55.347 04:49:30 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:55.347 04:49:30 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:55.347 04:49:30 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:05:55.347 04:49:30 -- common/autotest_common.sh@871 -- # break 00:05:55.347 04:49:30 -- common/autotest_common.sh@882 -- # (( 
i = 1 )) 00:05:55.347 04:49:30 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:55.347 04:49:30 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:55.347 1+0 records in 00:05:55.347 1+0 records out 00:05:55.347 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225171 s, 18.2 MB/s 00:05:55.347 04:49:30 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:55.605 04:49:30 -- common/autotest_common.sh@884 -- # size=4096 00:05:55.605 04:49:30 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:55.605 04:49:30 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:55.605 04:49:30 -- common/autotest_common.sh@887 -- # return 0 00:05:55.605 04:49:30 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:55.605 04:49:30 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:55.605 04:49:30 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:55.605 /dev/nbd1 00:05:55.605 04:49:30 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:55.605 04:49:30 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:55.605 04:49:30 -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:05:55.605 04:49:30 -- common/autotest_common.sh@867 -- # local i 00:05:55.605 04:49:30 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:05:55.605 04:49:30 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:05:55.605 04:49:30 -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:05:55.605 04:49:30 -- common/autotest_common.sh@871 -- # break 00:05:55.605 04:49:30 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:05:55.605 04:49:30 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:05:55.605 04:49:30 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:55.605 1+0 records in 00:05:55.605 1+0 records out 00:05:55.605 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230065 s, 17.8 MB/s 00:05:55.605 04:49:30 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:55.605 04:49:30 -- common/autotest_common.sh@884 -- # size=4096 00:05:55.606 04:49:30 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdtest 00:05:55.606 04:49:30 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:05:55.606 04:49:30 -- common/autotest_common.sh@887 -- # return 0 00:05:55.606 04:49:30 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:55.606 04:49:30 -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:55.606 04:49:30 -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:55.606 04:49:30 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:55.606 04:49:30 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:55.865 { 00:05:55.865 "nbd_device": "/dev/nbd0", 00:05:55.865 "bdev_name": "Malloc0" 00:05:55.865 }, 00:05:55.865 { 00:05:55.865 "nbd_device": "/dev/nbd1", 00:05:55.865 "bdev_name": "Malloc1" 00:05:55.865 } 00:05:55.865 ]' 00:05:55.865 04:49:30 
-- bdev/nbd_common.sh@64 -- # echo '[ 00:05:55.865 { 00:05:55.865 "nbd_device": "/dev/nbd0", 00:05:55.865 "bdev_name": "Malloc0" 00:05:55.865 }, 00:05:55.865 { 00:05:55.865 "nbd_device": "/dev/nbd1", 00:05:55.865 "bdev_name": "Malloc1" 00:05:55.865 } 00:05:55.865 ]' 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:55.865 /dev/nbd1' 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:55.865 /dev/nbd1' 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@65 -- # count=2 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@66 -- # echo 2 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@95 -- # count=2 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:55.865 256+0 records in 00:05:55.865 256+0 records out 00:05:55.865 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0113096 s, 92.7 MB/s 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:55.865 256+0 records in 00:05:55.865 256+0 records out 00:05:55.865 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0195964 s, 53.5 MB/s 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:55.865 256+0 records in 00:05:55.865 256+0 records out 00:05:55.865 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0208676 s, 50.2 MB/s 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:55.865 04:49:30 -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/nbdrandtest 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@51 -- # local i 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:56.125 04:49:30 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@41 -- # break 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@45 -- # return 0 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:56.125 04:49:31 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@41 -- # break 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@45 -- # return 0 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:56.384 04:49:31 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:56.642 04:49:31 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:56.642 04:49:31 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@65 -- # echo '' 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@65 -- # true 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@65 -- # count=0 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@66 -- # echo 0 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@104 -- # count=0 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:56.643 04:49:31 -- bdev/nbd_common.sh@109 -- # return 0 00:05:56.643 04:49:31 -- event/event.sh@34 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:56.902 04:49:31 -- 
event/event.sh@35 -- # sleep 3 00:05:56.902 [2024-11-08 04:49:31.987630] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:57.161 [2024-11-08 04:49:32.051533] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:05:57.161 [2024-11-08 04:49:32.051534] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:05:57.161 [2024-11-08 04:49:32.091615] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:57.161 [2024-11-08 04:49:32.091656] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:00.451 04:49:34 -- event/event.sh@38 -- # waitforlisten 3658902 /var/tmp/spdk-nbd.sock 00:06:00.451 04:49:34 -- common/autotest_common.sh@829 -- # '[' -z 3658902 ']' 00:06:00.451 04:49:34 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:00.451 04:49:34 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:00.451 04:49:34 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:00.451 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:00.451 04:49:34 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:00.451 04:49:34 -- common/autotest_common.sh@10 -- # set +x 00:06:00.451 04:49:34 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:00.451 04:49:34 -- common/autotest_common.sh@862 -- # return 0 00:06:00.451 04:49:34 -- event/event.sh@39 -- # killprocess 3658902 00:06:00.451 04:49:34 -- common/autotest_common.sh@936 -- # '[' -z 3658902 ']' 00:06:00.451 04:49:34 -- common/autotest_common.sh@940 -- # kill -0 3658902 00:06:00.451 04:49:34 -- common/autotest_common.sh@941 -- # uname 00:06:00.451 04:49:34 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:00.451 04:49:34 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3658902 00:06:00.451 04:49:35 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:00.451 04:49:35 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:00.451 04:49:35 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3658902' 00:06:00.451 killing process with pid 3658902 00:06:00.451 04:49:35 -- common/autotest_common.sh@955 -- # kill 3658902 00:06:00.451 04:49:35 -- common/autotest_common.sh@960 -- # wait 3658902 00:06:00.451 spdk_app_start is called in Round 0. 00:06:00.451 Shutdown signal received, stop current app iteration 00:06:00.451 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:00.451 spdk_app_start is called in Round 1. 00:06:00.451 Shutdown signal received, stop current app iteration 00:06:00.451 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:00.451 spdk_app_start is called in Round 2. 00:06:00.451 Shutdown signal received, stop current app iteration 00:06:00.451 Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 reinitialization... 00:06:00.451 spdk_app_start is called in Round 3. 
00:06:00.451 Shutdown signal received, stop current app iteration 00:06:00.451 04:49:35 -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:00.451 04:49:35 -- event/event.sh@42 -- # return 0 00:06:00.451 00:06:00.451 real 0m16.453s 00:06:00.451 user 0m35.084s 00:06:00.451 sys 0m3.029s 00:06:00.451 04:49:35 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:00.451 04:49:35 -- common/autotest_common.sh@10 -- # set +x 00:06:00.451 ************************************ 00:06:00.451 END TEST app_repeat 00:06:00.451 ************************************ 00:06:00.451 04:49:35 -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:00.451 04:49:35 -- event/event.sh@55 -- # run_test cpu_locks /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:00.451 04:49:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:00.451 04:49:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.451 04:49:35 -- common/autotest_common.sh@10 -- # set +x 00:06:00.451 ************************************ 00:06:00.451 START TEST cpu_locks 00:06:00.451 ************************************ 00:06:00.451 04:49:35 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event/cpu_locks.sh 00:06:00.451 * Looking for test storage... 00:06:00.451 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/event 00:06:00.451 04:49:35 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:00.451 04:49:35 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:00.451 04:49:35 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:00.451 04:49:35 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:00.452 04:49:35 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:00.452 04:49:35 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:00.452 04:49:35 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:00.452 04:49:35 -- scripts/common.sh@335 -- # IFS=.-: 00:06:00.452 04:49:35 -- scripts/common.sh@335 -- # read -ra ver1 00:06:00.452 04:49:35 -- scripts/common.sh@336 -- # IFS=.-: 00:06:00.452 04:49:35 -- scripts/common.sh@336 -- # read -ra ver2 00:06:00.452 04:49:35 -- scripts/common.sh@337 -- # local 'op=<' 00:06:00.452 04:49:35 -- scripts/common.sh@339 -- # ver1_l=2 00:06:00.452 04:49:35 -- scripts/common.sh@340 -- # ver2_l=1 00:06:00.452 04:49:35 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:00.452 04:49:35 -- scripts/common.sh@343 -- # case "$op" in 00:06:00.452 04:49:35 -- scripts/common.sh@344 -- # : 1 00:06:00.452 04:49:35 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:00.452 04:49:35 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:00.452 04:49:35 -- scripts/common.sh@364 -- # decimal 1 00:06:00.452 04:49:35 -- scripts/common.sh@352 -- # local d=1 00:06:00.452 04:49:35 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:00.452 04:49:35 -- scripts/common.sh@354 -- # echo 1 00:06:00.452 04:49:35 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:00.452 04:49:35 -- scripts/common.sh@365 -- # decimal 2 00:06:00.452 04:49:35 -- scripts/common.sh@352 -- # local d=2 00:06:00.452 04:49:35 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:00.452 04:49:35 -- scripts/common.sh@354 -- # echo 2 00:06:00.452 04:49:35 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:00.452 04:49:35 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:00.452 04:49:35 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:00.452 04:49:35 -- scripts/common.sh@367 -- # return 0 00:06:00.452 04:49:35 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:00.452 04:49:35 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:00.452 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:00.452 --rc genhtml_branch_coverage=1 00:06:00.452 --rc genhtml_function_coverage=1 00:06:00.452 --rc genhtml_legend=1 00:06:00.452 --rc geninfo_all_blocks=1 00:06:00.452 --rc geninfo_unexecuted_blocks=1 00:06:00.452 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:00.452 ' 00:06:00.452 04:49:35 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:00.452 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:00.452 --rc genhtml_branch_coverage=1 00:06:00.452 --rc genhtml_function_coverage=1 00:06:00.452 --rc genhtml_legend=1 00:06:00.452 --rc geninfo_all_blocks=1 00:06:00.452 --rc geninfo_unexecuted_blocks=1 00:06:00.452 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:00.452 ' 00:06:00.452 04:49:35 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:00.452 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:00.452 --rc genhtml_branch_coverage=1 00:06:00.452 --rc genhtml_function_coverage=1 00:06:00.452 --rc genhtml_legend=1 00:06:00.452 --rc geninfo_all_blocks=1 00:06:00.452 --rc geninfo_unexecuted_blocks=1 00:06:00.452 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:00.452 ' 00:06:00.452 04:49:35 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:00.452 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:00.452 --rc genhtml_branch_coverage=1 00:06:00.452 --rc genhtml_function_coverage=1 00:06:00.452 --rc genhtml_legend=1 00:06:00.452 --rc geninfo_all_blocks=1 00:06:00.452 --rc geninfo_unexecuted_blocks=1 00:06:00.452 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:00.452 ' 00:06:00.452 04:49:35 -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:06:00.452 04:49:35 -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:06:00.452 04:49:35 -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:06:00.452 04:49:35 -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:06:00.452 04:49:35 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:00.452 04:49:35 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:00.452 04:49:35 -- common/autotest_common.sh@10 -- # set +x 00:06:00.452 ************************************ 00:06:00.452 START TEST default_locks 
00:06:00.452 ************************************ 00:06:00.452 04:49:35 -- common/autotest_common.sh@1114 -- # default_locks 00:06:00.452 04:49:35 -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=3662100 00:06:00.452 04:49:35 -- event/cpu_locks.sh@47 -- # waitforlisten 3662100 00:06:00.452 04:49:35 -- common/autotest_common.sh@829 -- # '[' -z 3662100 ']' 00:06:00.452 04:49:35 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:00.452 04:49:35 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:00.452 04:49:35 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:00.452 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:00.452 04:49:35 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:00.452 04:49:35 -- common/autotest_common.sh@10 -- # set +x 00:06:00.452 04:49:35 -- event/cpu_locks.sh@45 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:00.452 [2024-11-08 04:49:35.479060] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:00.452 [2024-11-08 04:49:35.479125] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3662100 ] 00:06:00.452 EAL: No free 2048 kB hugepages reported on node 1 00:06:00.452 [2024-11-08 04:49:35.545369] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:00.710 [2024-11-08 04:49:35.623010] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:00.711 [2024-11-08 04:49:35.623115] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.277 04:49:36 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:01.277 04:49:36 -- common/autotest_common.sh@862 -- # return 0 00:06:01.277 04:49:36 -- event/cpu_locks.sh@49 -- # locks_exist 3662100 00:06:01.277 04:49:36 -- event/cpu_locks.sh@22 -- # lslocks -p 3662100 00:06:01.277 04:49:36 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:01.843 lslocks: write error 00:06:01.843 04:49:36 -- event/cpu_locks.sh@50 -- # killprocess 3662100 00:06:01.843 04:49:36 -- common/autotest_common.sh@936 -- # '[' -z 3662100 ']' 00:06:01.843 04:49:36 -- common/autotest_common.sh@940 -- # kill -0 3662100 00:06:01.843 04:49:36 -- common/autotest_common.sh@941 -- # uname 00:06:01.843 04:49:36 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:01.843 04:49:36 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3662100 00:06:02.102 04:49:36 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:02.102 04:49:36 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:02.102 04:49:36 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3662100' 00:06:02.102 killing process with pid 3662100 00:06:02.102 04:49:36 -- common/autotest_common.sh@955 -- # kill 3662100 00:06:02.102 04:49:36 -- common/autotest_common.sh@960 -- # wait 3662100 00:06:02.360 04:49:37 -- event/cpu_locks.sh@52 -- # NOT waitforlisten 3662100 00:06:02.360 04:49:37 -- common/autotest_common.sh@650 -- # local es=0 00:06:02.360 04:49:37 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3662100 00:06:02.360 04:49:37 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:02.360 04:49:37 -- 
common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:02.360 04:49:37 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:02.360 04:49:37 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:02.360 04:49:37 -- common/autotest_common.sh@653 -- # waitforlisten 3662100 00:06:02.360 04:49:37 -- common/autotest_common.sh@829 -- # '[' -z 3662100 ']' 00:06:02.360 04:49:37 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:02.360 04:49:37 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:02.360 04:49:37 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:02.360 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:02.360 04:49:37 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:02.360 04:49:37 -- common/autotest_common.sh@10 -- # set +x 00:06:02.360 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (3662100) - No such process 00:06:02.360 ERROR: process (pid: 3662100) is no longer running 00:06:02.360 04:49:37 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:02.360 04:49:37 -- common/autotest_common.sh@862 -- # return 1 00:06:02.360 04:49:37 -- common/autotest_common.sh@653 -- # es=1 00:06:02.360 04:49:37 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:02.360 04:49:37 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:02.360 04:49:37 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:02.360 04:49:37 -- event/cpu_locks.sh@54 -- # no_locks 00:06:02.360 04:49:37 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:02.360 04:49:37 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:02.360 04:49:37 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:02.360 00:06:02.360 real 0m1.805s 00:06:02.360 user 0m1.926s 00:06:02.360 sys 0m0.600s 00:06:02.360 04:49:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:02.360 04:49:37 -- common/autotest_common.sh@10 -- # set +x 00:06:02.360 ************************************ 00:06:02.360 END TEST default_locks 00:06:02.360 ************************************ 00:06:02.360 04:49:37 -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:06:02.360 04:49:37 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:02.360 04:49:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:02.360 04:49:37 -- common/autotest_common.sh@10 -- # set +x 00:06:02.360 ************************************ 00:06:02.360 START TEST default_locks_via_rpc 00:06:02.360 ************************************ 00:06:02.360 04:49:37 -- common/autotest_common.sh@1114 -- # default_locks_via_rpc 00:06:02.360 04:49:37 -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=3662409 00:06:02.360 04:49:37 -- event/cpu_locks.sh@63 -- # waitforlisten 3662409 00:06:02.360 04:49:37 -- common/autotest_common.sh@829 -- # '[' -z 3662409 ']' 00:06:02.360 04:49:37 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:02.360 04:49:37 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:02.360 04:49:37 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:02.360 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
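Annotation: the default_locks failure path above is intentional. After killprocess reaps the target, NOT waitforlisten must see the helper fail, so the "kill: (3662100) - No such process" and "ERROR: process (pid: 3662100) is no longer running" lines are the expected outcome, not a fault. A minimal sketch of the helper's shape, inferred from the trace (max_retries=100 and the echo text appear verbatim above; the rpc_get_methods readiness probe and the 0.5 s poll interval are assumptions):

    waitforlisten() {
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} max_retries=100
        echo "Waiting for process to start up and listen on UNIX domain socket $rpc_addr..."
        while ((max_retries--)); do
            # Liveness probe: a dead pid makes kill print the "No such process"
            # noise seen above, and the helper returns 1.
            kill -s 0 "$pid" || return 1
            # Assumed readiness probe: any cheap RPC served on the socket will do.
            scripts/rpc.py -t 1 -s "$rpc_addr" rpc_get_methods &> /dev/null && return 0
            sleep 0.5
        done
        return 1
    }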
00:06:02.360 04:49:37 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:02.360 04:49:37 -- common/autotest_common.sh@10 -- # set +x 00:06:02.360 04:49:37 -- event/cpu_locks.sh@61 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:02.360 [2024-11-08 04:49:37.329564] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:02.360 [2024-11-08 04:49:37.329653] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3662409 ] 00:06:02.360 EAL: No free 2048 kB hugepages reported on node 1 00:06:02.360 [2024-11-08 04:49:37.396774] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.618 [2024-11-08 04:49:37.471655] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:02.618 [2024-11-08 04:49:37.471757] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.184 04:49:38 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:03.184 04:49:38 -- common/autotest_common.sh@862 -- # return 0 00:06:03.184 04:49:38 -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:06:03.184 04:49:38 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:03.184 04:49:38 -- common/autotest_common.sh@10 -- # set +x 00:06:03.184 04:49:38 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:03.184 04:49:38 -- event/cpu_locks.sh@67 -- # no_locks 00:06:03.184 04:49:38 -- event/cpu_locks.sh@26 -- # lock_files=() 00:06:03.184 04:49:38 -- event/cpu_locks.sh@26 -- # local lock_files 00:06:03.184 04:49:38 -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:06:03.184 04:49:38 -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:06:03.184 04:49:38 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:03.184 04:49:38 -- common/autotest_common.sh@10 -- # set +x 00:06:03.184 04:49:38 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:03.184 04:49:38 -- event/cpu_locks.sh@71 -- # locks_exist 3662409 00:06:03.184 04:49:38 -- event/cpu_locks.sh@22 -- # lslocks -p 3662409 00:06:03.184 04:49:38 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:03.750 04:49:38 -- event/cpu_locks.sh@73 -- # killprocess 3662409 00:06:03.750 04:49:38 -- common/autotest_common.sh@936 -- # '[' -z 3662409 ']' 00:06:03.750 04:49:38 -- common/autotest_common.sh@940 -- # kill -0 3662409 00:06:03.750 04:49:38 -- common/autotest_common.sh@941 -- # uname 00:06:03.750 04:49:38 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:03.750 04:49:38 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3662409 00:06:03.750 04:49:38 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:03.750 04:49:38 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:03.750 04:49:38 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3662409' 00:06:03.750 killing process with pid 3662409 00:06:03.750 04:49:38 -- common/autotest_common.sh@955 -- # kill 3662409 00:06:03.750 04:49:38 -- common/autotest_common.sh@960 -- # wait 3662409 00:06:04.009 00:06:04.009 real 0m1.722s 00:06:04.009 user 0m1.810s 00:06:04.009 sys 0m0.602s 00:06:04.009 04:49:39 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:04.009 04:49:39 -- common/autotest_common.sh@10 -- # set +x 00:06:04.009 ************************************ 00:06:04.009 END TEST 
default_locks_via_rpc 00:06:04.009 ************************************ 00:06:04.009 04:49:39 -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:06:04.009 04:49:39 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:04.009 04:49:39 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:04.009 04:49:39 -- common/autotest_common.sh@10 -- # set +x 00:06:04.009 ************************************ 00:06:04.009 START TEST non_locking_app_on_locked_coremask 00:06:04.009 ************************************ 00:06:04.009 04:49:39 -- common/autotest_common.sh@1114 -- # non_locking_app_on_locked_coremask 00:06:04.009 04:49:39 -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=3662719 00:06:04.009 04:49:39 -- event/cpu_locks.sh@81 -- # waitforlisten 3662719 /var/tmp/spdk.sock 00:06:04.009 04:49:39 -- event/cpu_locks.sh@79 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:04.009 04:49:39 -- common/autotest_common.sh@829 -- # '[' -z 3662719 ']' 00:06:04.009 04:49:39 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:04.009 04:49:39 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:04.009 04:49:39 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:04.009 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:04.009 04:49:39 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:04.009 04:49:39 -- common/autotest_common.sh@10 -- # set +x 00:06:04.009 [2024-11-08 04:49:39.089537] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:04.009 [2024-11-08 04:49:39.089630] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3662719 ] 00:06:04.267 EAL: No free 2048 kB hugepages reported on node 1 00:06:04.267 [2024-11-08 04:49:39.158012] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.267 [2024-11-08 04:49:39.234353] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:04.267 [2024-11-08 04:49:39.234460] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.833 04:49:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:04.833 04:49:39 -- common/autotest_common.sh@862 -- # return 0 00:06:04.833 04:49:39 -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=3662971 00:06:04.833 04:49:39 -- event/cpu_locks.sh@85 -- # waitforlisten 3662971 /var/tmp/spdk2.sock 00:06:04.833 04:49:39 -- event/cpu_locks.sh@83 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:06:04.833 04:49:39 -- common/autotest_common.sh@829 -- # '[' -z 3662971 ']' 00:06:04.833 04:49:39 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:04.833 04:49:39 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:04.833 04:49:39 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:04.833 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
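Annotation: default_locks_via_rpc, which finished just above, exercises the same per-core lock but toggles it at runtime instead of at startup — framework_disable_cpumask_locks drops the lock files while the target keeps running, and framework_enable_cpumask_locks reclaims them (both RPC names appear verbatim in the trace). A hand-run equivalent against a live target might look like the sketch below; the pgrep lookup is illustrative and assumes a single running spdk_tgt:

    scripts/rpc.py -s /var/tmp/spdk.sock framework_disable_cpumask_locks
    lslocks -p "$(pgrep -f spdk_tgt)" | grep -c spdk_cpu_lock    # expect 0
    scripts/rpc.py -s /var/tmp/spdk.sock framework_enable_cpumask_locks
    lslocks -p "$(pgrep -f spdk_tgt)" | grep -c spdk_cpu_lock    # expect 1 for -m 0x1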
00:06:04.833 04:49:39 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:04.833 04:49:39 -- common/autotest_common.sh@10 -- # set +x 00:06:04.833 [2024-11-08 04:49:39.940648] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:04.833 [2024-11-08 04:49:39.940712] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3662971 ] 00:06:05.090 EAL: No free 2048 kB hugepages reported on node 1 00:06:05.090 [2024-11-08 04:49:40.030520] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:05.090 [2024-11-08 04:49:40.034567] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.090 [2024-11-08 04:49:40.195057] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:05.090 [2024-11-08 04:49:40.195164] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.024 04:49:40 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:06.024 04:49:40 -- common/autotest_common.sh@862 -- # return 0 00:06:06.024 04:49:40 -- event/cpu_locks.sh@87 -- # locks_exist 3662719 00:06:06.024 04:49:40 -- event/cpu_locks.sh@22 -- # lslocks -p 3662719 00:06:06.024 04:49:40 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:06.621 lslocks: write error 00:06:06.621 04:49:41 -- event/cpu_locks.sh@89 -- # killprocess 3662719 00:06:06.621 04:49:41 -- common/autotest_common.sh@936 -- # '[' -z 3662719 ']' 00:06:06.621 04:49:41 -- common/autotest_common.sh@940 -- # kill -0 3662719 00:06:06.621 04:49:41 -- common/autotest_common.sh@941 -- # uname 00:06:06.621 04:49:41 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:06.621 04:49:41 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3662719 00:06:06.907 04:49:41 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:06.907 04:49:41 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:06.907 04:49:41 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3662719' 00:06:06.907 killing process with pid 3662719 00:06:06.907 04:49:41 -- common/autotest_common.sh@955 -- # kill 3662719 00:06:06.907 04:49:41 -- common/autotest_common.sh@960 -- # wait 3662719 00:06:07.473 04:49:42 -- event/cpu_locks.sh@90 -- # killprocess 3662971 00:06:07.473 04:49:42 -- common/autotest_common.sh@936 -- # '[' -z 3662971 ']' 00:06:07.473 04:49:42 -- common/autotest_common.sh@940 -- # kill -0 3662971 00:06:07.473 04:49:42 -- common/autotest_common.sh@941 -- # uname 00:06:07.473 04:49:42 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:07.473 04:49:42 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3662971 00:06:07.473 04:49:42 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:07.473 04:49:42 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:07.473 04:49:42 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3662971' 00:06:07.473 killing process with pid 3662971 00:06:07.473 04:49:42 -- common/autotest_common.sh@955 -- # kill 3662971 00:06:07.473 04:49:42 -- common/autotest_common.sh@960 -- # wait 3662971 00:06:07.731 00:06:07.731 real 0m3.668s 00:06:07.731 user 0m3.936s 00:06:07.731 sys 0m1.189s 00:06:07.731 04:49:42 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:07.731 04:49:42 -- common/autotest_common.sh@10 -- # set +x 00:06:07.731 
************************************ 00:06:07.731 END TEST non_locking_app_on_locked_coremask 00:06:07.731 ************************************ 00:06:07.731 04:49:42 -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:06:07.731 04:49:42 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:07.731 04:49:42 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:07.731 04:49:42 -- common/autotest_common.sh@10 -- # set +x 00:06:07.731 ************************************ 00:06:07.731 START TEST locking_app_on_unlocked_coremask 00:06:07.731 ************************************ 00:06:07.731 04:49:42 -- common/autotest_common.sh@1114 -- # locking_app_on_unlocked_coremask 00:06:07.731 04:49:42 -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=3663546 00:06:07.731 04:49:42 -- event/cpu_locks.sh@99 -- # waitforlisten 3663546 /var/tmp/spdk.sock 00:06:07.731 04:49:42 -- event/cpu_locks.sh@97 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:06:07.731 04:49:42 -- common/autotest_common.sh@829 -- # '[' -z 3663546 ']' 00:06:07.731 04:49:42 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:07.731 04:49:42 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:07.731 04:49:42 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:07.731 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:07.731 04:49:42 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:07.731 04:49:42 -- common/autotest_common.sh@10 -- # set +x 00:06:07.731 [2024-11-08 04:49:42.806641] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:07.731 [2024-11-08 04:49:42.806715] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3663546 ] 00:06:07.731 EAL: No free 2048 kB hugepages reported on node 1 00:06:07.990 [2024-11-08 04:49:42.872974] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:07.990 [2024-11-08 04:49:42.873003] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.990 [2024-11-08 04:49:42.947310] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:07.990 [2024-11-08 04:49:42.947418] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:08.556 04:49:43 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:08.556 04:49:43 -- common/autotest_common.sh@862 -- # return 0 00:06:08.556 04:49:43 -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=3663568 00:06:08.556 04:49:43 -- event/cpu_locks.sh@103 -- # waitforlisten 3663568 /var/tmp/spdk2.sock 00:06:08.556 04:49:43 -- event/cpu_locks.sh@101 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:08.556 04:49:43 -- common/autotest_common.sh@829 -- # '[' -z 3663568 ']' 00:06:08.556 04:49:43 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:08.556 04:49:43 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:08.556 04:49:43 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
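Annotation: non_locking_app_on_locked_coremask (ended above) and locking_app_on_unlocked_coremask (starting here) are mirror images — the first puts --disable-cpumask-locks on the second target, the second puts it on the first. Either way only one process ever tries to claim core 0, so both instances coexist. Roughly, for the variant now starting (binary and socket paths as in the log; backgrounding and cleanup elided):

    build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks &    # "CPU core locks deactivated", claims nothing
    build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock &     # claims core 0 unopposed, starts fine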
00:06:08.556 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:08.556 04:49:43 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:08.556 04:49:43 -- common/autotest_common.sh@10 -- # set +x 00:06:08.556 [2024-11-08 04:49:43.656543] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:08.556 [2024-11-08 04:49:43.656609] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3663568 ] 00:06:08.814 EAL: No free 2048 kB hugepages reported on node 1 00:06:08.814 [2024-11-08 04:49:43.750535] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:08.814 [2024-11-08 04:49:43.888120] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:08.814 [2024-11-08 04:49:43.888229] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.380 04:49:44 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:09.380 04:49:44 -- common/autotest_common.sh@862 -- # return 0 00:06:09.380 04:49:44 -- event/cpu_locks.sh@105 -- # locks_exist 3663568 00:06:09.380 04:49:44 -- event/cpu_locks.sh@22 -- # lslocks -p 3663568 00:06:09.380 04:49:44 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:10.754 lslocks: write error 00:06:10.754 04:49:45 -- event/cpu_locks.sh@107 -- # killprocess 3663546 00:06:10.754 04:49:45 -- common/autotest_common.sh@936 -- # '[' -z 3663546 ']' 00:06:10.754 04:49:45 -- common/autotest_common.sh@940 -- # kill -0 3663546 00:06:10.754 04:49:45 -- common/autotest_common.sh@941 -- # uname 00:06:10.754 04:49:45 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:10.754 04:49:45 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3663546 00:06:10.754 04:49:45 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:10.754 04:49:45 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:10.754 04:49:45 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3663546' 00:06:10.754 killing process with pid 3663546 00:06:10.754 04:49:45 -- common/autotest_common.sh@955 -- # kill 3663546 00:06:10.754 04:49:45 -- common/autotest_common.sh@960 -- # wait 3663546 00:06:11.320 04:49:46 -- event/cpu_locks.sh@108 -- # killprocess 3663568 00:06:11.320 04:49:46 -- common/autotest_common.sh@936 -- # '[' -z 3663568 ']' 00:06:11.320 04:49:46 -- common/autotest_common.sh@940 -- # kill -0 3663568 00:06:11.320 04:49:46 -- common/autotest_common.sh@941 -- # uname 00:06:11.320 04:49:46 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:11.320 04:49:46 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3663568 00:06:11.320 04:49:46 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:11.320 04:49:46 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:11.320 04:49:46 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3663568' 00:06:11.320 killing process with pid 3663568 00:06:11.320 04:49:46 -- common/autotest_common.sh@955 -- # kill 3663568 00:06:11.320 04:49:46 -- common/autotest_common.sh@960 -- # wait 3663568 00:06:11.578 00:06:11.578 real 0m3.742s 00:06:11.578 user 0m3.990s 00:06:11.578 sys 0m1.260s 00:06:11.578 04:49:46 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:11.578 04:49:46 -- common/autotest_common.sh@10 -- # set +x 00:06:11.578 
************************************ 00:06:11.578 END TEST locking_app_on_unlocked_coremask 00:06:11.578 ************************************ 00:06:11.578 04:49:46 -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:06:11.578 04:49:46 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:11.578 04:49:46 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:11.578 04:49:46 -- common/autotest_common.sh@10 -- # set +x 00:06:11.578 ************************************ 00:06:11.578 START TEST locking_app_on_locked_coremask 00:06:11.578 ************************************ 00:06:11.578 04:49:46 -- common/autotest_common.sh@1114 -- # locking_app_on_locked_coremask 00:06:11.578 04:49:46 -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=3664136 00:06:11.579 04:49:46 -- event/cpu_locks.sh@116 -- # waitforlisten 3664136 /var/tmp/spdk.sock 00:06:11.579 04:49:46 -- event/cpu_locks.sh@114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:11.579 04:49:46 -- common/autotest_common.sh@829 -- # '[' -z 3664136 ']' 00:06:11.579 04:49:46 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.579 04:49:46 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:11.579 04:49:46 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.579 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.579 04:49:46 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:11.579 04:49:46 -- common/autotest_common.sh@10 -- # set +x 00:06:11.579 [2024-11-08 04:49:46.599521] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
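Annotation: throughout these tests, locks_exist is the assertion that a pid really holds its per-core lock, and the stray "lslocks: write error" lines are a side effect of how it is checked: grep -q exits on the first match and closes the pipe, so lslocks takes an EPIPE on stdout. A sketch of the check, with the spdk_cpu_lock prefix taken from the grep pattern visible in the trace:

    locks_exist() {
        local pid=$1
        # One advisory lock per claimed core; the early-exiting grep -q is what
        # makes lslocks print "lslocks: write error" in the log. Benign.
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }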
00:06:11.579 [2024-11-08 04:49:46.599624] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3664136 ] 00:06:11.579 EAL: No free 2048 kB hugepages reported on node 1 00:06:11.579 [2024-11-08 04:49:46.666333] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:11.837 [2024-11-08 04:49:46.731680] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:11.837 [2024-11-08 04:49:46.731789] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:12.403 04:49:47 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:12.403 04:49:47 -- common/autotest_common.sh@862 -- # return 0 00:06:12.403 04:49:47 -- event/cpu_locks.sh@118 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:06:12.403 04:49:47 -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=3664406 00:06:12.403 04:49:47 -- event/cpu_locks.sh@120 -- # NOT waitforlisten 3664406 /var/tmp/spdk2.sock 00:06:12.403 04:49:47 -- common/autotest_common.sh@650 -- # local es=0 00:06:12.403 04:49:47 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3664406 /var/tmp/spdk2.sock 00:06:12.403 04:49:47 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:12.403 04:49:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:12.403 04:49:47 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:12.403 04:49:47 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:12.403 04:49:47 -- common/autotest_common.sh@653 -- # waitforlisten 3664406 /var/tmp/spdk2.sock 00:06:12.403 04:49:47 -- common/autotest_common.sh@829 -- # '[' -z 3664406 ']' 00:06:12.403 04:49:47 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:12.403 04:49:47 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:12.403 04:49:47 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:12.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:12.403 04:49:47 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:12.403 04:49:47 -- common/autotest_common.sh@10 -- # set +x 00:06:12.403 [2024-11-08 04:49:47.459278] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:12.403 [2024-11-08 04:49:47.459365] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3664406 ] 00:06:12.403 EAL: No free 2048 kB hugepages reported on node 1 00:06:12.661 [2024-11-08 04:49:47.552582] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 3664136 has claimed it. 00:06:12.661 [2024-11-08 04:49:47.552621] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 
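Annotation: here both targets keep locking enabled and ask for the same mask, so the second one must die on the claim — "Cannot create lock on core 0, probably process 3664136 has claimed it" is the pass condition that the NOT wrapper below goes on to verify. Reproduced by hand it would be roughly:

    build/bin/spdk_tgt -m 0x1 &                         # claims core 0
    build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock    # "Cannot create lock on core 0" -> exits
    echo $?                                             # non-zero, which is what NOT waitforlisten expects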
00:06:13.226 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (3664406) - No such process 00:06:13.226 ERROR: process (pid: 3664406) is no longer running 00:06:13.226 04:49:48 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:13.226 04:49:48 -- common/autotest_common.sh@862 -- # return 1 00:06:13.226 04:49:48 -- common/autotest_common.sh@653 -- # es=1 00:06:13.226 04:49:48 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:13.226 04:49:48 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:13.227 04:49:48 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:13.227 04:49:48 -- event/cpu_locks.sh@122 -- # locks_exist 3664136 00:06:13.227 04:49:48 -- event/cpu_locks.sh@22 -- # lslocks -p 3664136 00:06:13.227 04:49:48 -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:06:13.485 lslocks: write error 00:06:13.485 04:49:48 -- event/cpu_locks.sh@124 -- # killprocess 3664136 00:06:13.485 04:49:48 -- common/autotest_common.sh@936 -- # '[' -z 3664136 ']' 00:06:13.485 04:49:48 -- common/autotest_common.sh@940 -- # kill -0 3664136 00:06:13.485 04:49:48 -- common/autotest_common.sh@941 -- # uname 00:06:13.485 04:49:48 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:13.485 04:49:48 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3664136 00:06:13.485 04:49:48 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:13.485 04:49:48 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:13.485 04:49:48 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3664136' 00:06:13.485 killing process with pid 3664136 00:06:13.485 04:49:48 -- common/autotest_common.sh@955 -- # kill 3664136 00:06:13.485 04:49:48 -- common/autotest_common.sh@960 -- # wait 3664136 00:06:13.743 00:06:13.743 real 0m2.274s 00:06:13.743 user 0m2.474s 00:06:13.743 sys 0m0.676s 00:06:13.743 04:49:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:13.743 04:49:48 -- common/autotest_common.sh@10 -- # set +x 00:06:13.743 ************************************ 00:06:13.743 END TEST locking_app_on_locked_coremask 00:06:13.743 ************************************ 00:06:14.002 04:49:48 -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:06:14.002 04:49:48 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:14.002 04:49:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:14.002 04:49:48 -- common/autotest_common.sh@10 -- # set +x 00:06:14.002 ************************************ 00:06:14.002 START TEST locking_overlapped_coremask 00:06:14.002 ************************************ 00:06:14.002 04:49:48 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask 00:06:14.002 04:49:48 -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=3664698 00:06:14.002 04:49:48 -- event/cpu_locks.sh@133 -- # waitforlisten 3664698 /var/tmp/spdk.sock 00:06:14.002 04:49:48 -- event/cpu_locks.sh@131 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 00:06:14.002 04:49:48 -- common/autotest_common.sh@829 -- # '[' -z 3664698 ']' 00:06:14.002 04:49:48 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:14.002 04:49:48 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.002 04:49:48 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:06:14.002 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:14.002 04:49:48 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.002 04:49:48 -- common/autotest_common.sh@10 -- # set +x 00:06:14.002 [2024-11-08 04:49:48.922729] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:14.002 [2024-11-08 04:49:48.922808] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3664698 ] 00:06:14.002 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.002 [2024-11-08 04:49:48.990080] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:14.002 [2024-11-08 04:49:49.054826] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:14.002 [2024-11-08 04:49:49.055015] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:14.002 [2024-11-08 04:49:49.055113] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:14.002 [2024-11-08 04:49:49.055115] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:14.936 04:49:49 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:14.936 04:49:49 -- common/autotest_common.sh@862 -- # return 0 00:06:14.936 04:49:49 -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=3664737 00:06:14.936 04:49:49 -- event/cpu_locks.sh@137 -- # NOT waitforlisten 3664737 /var/tmp/spdk2.sock 00:06:14.936 04:49:49 -- event/cpu_locks.sh@135 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:14.936 04:49:49 -- common/autotest_common.sh@650 -- # local es=0 00:06:14.936 04:49:49 -- common/autotest_common.sh@652 -- # valid_exec_arg waitforlisten 3664737 /var/tmp/spdk2.sock 00:06:14.936 04:49:49 -- common/autotest_common.sh@638 -- # local arg=waitforlisten 00:06:14.936 04:49:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:14.936 04:49:49 -- common/autotest_common.sh@642 -- # type -t waitforlisten 00:06:14.936 04:49:49 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:14.936 04:49:49 -- common/autotest_common.sh@653 -- # waitforlisten 3664737 /var/tmp/spdk2.sock 00:06:14.936 04:49:49 -- common/autotest_common.sh@829 -- # '[' -z 3664737 ']' 00:06:14.936 04:49:49 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:14.936 04:49:49 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:14.936 04:49:49 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:14.936 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:14.936 04:49:49 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:14.936 04:49:49 -- common/autotest_common.sh@10 -- # set +x 00:06:14.936 [2024-11-08 04:49:49.778222] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:14.936 [2024-11-08 04:49:49.778309] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3664737 ] 00:06:14.936 EAL: No free 2048 kB hugepages reported on node 1 00:06:14.936 [2024-11-08 04:49:49.873391] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3664698 has claimed it. 00:06:14.936 [2024-11-08 04:49:49.873437] app.c: 791:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:15.501 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 844: kill: (3664737) - No such process 00:06:15.501 ERROR: process (pid: 3664737) is no longer running 00:06:15.501 04:49:50 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:15.501 04:49:50 -- common/autotest_common.sh@862 -- # return 1 00:06:15.501 04:49:50 -- common/autotest_common.sh@653 -- # es=1 00:06:15.501 04:49:50 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:15.501 04:49:50 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:15.501 04:49:50 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:15.501 04:49:50 -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:15.501 04:49:50 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:15.501 04:49:50 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:15.501 04:49:50 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:15.501 04:49:50 -- event/cpu_locks.sh@141 -- # killprocess 3664698 00:06:15.501 04:49:50 -- common/autotest_common.sh@936 -- # '[' -z 3664698 ']' 00:06:15.501 04:49:50 -- common/autotest_common.sh@940 -- # kill -0 3664698 00:06:15.501 04:49:50 -- common/autotest_common.sh@941 -- # uname 00:06:15.501 04:49:50 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:15.501 04:49:50 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3664698 00:06:15.501 04:49:50 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:15.501 04:49:50 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:15.502 04:49:50 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3664698' 00:06:15.502 killing process with pid 3664698 00:06:15.502 04:49:50 -- common/autotest_common.sh@955 -- # kill 3664698 00:06:15.502 04:49:50 -- common/autotest_common.sh@960 -- # wait 3664698 00:06:15.760 00:06:15.760 real 0m1.914s 00:06:15.760 user 0m5.482s 00:06:15.760 sys 0m0.427s 00:06:15.760 04:49:50 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:15.760 04:49:50 -- common/autotest_common.sh@10 -- # set +x 00:06:15.760 ************************************ 00:06:15.760 END TEST locking_overlapped_coremask 00:06:15.760 ************************************ 00:06:15.760 04:49:50 -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:15.760 04:49:50 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:15.760 04:49:50 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:15.761 04:49:50 -- common/autotest_common.sh@10 -- # set +x 00:06:15.761 ************************************ 00:06:15.761 
START TEST locking_overlapped_coremask_via_rpc 00:06:15.761 ************************************ 00:06:15.761 04:49:50 -- common/autotest_common.sh@1114 -- # locking_overlapped_coremask_via_rpc 00:06:15.761 04:49:50 -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=3665015 00:06:15.761 04:49:50 -- event/cpu_locks.sh@149 -- # waitforlisten 3665015 /var/tmp/spdk.sock 00:06:15.761 04:49:50 -- event/cpu_locks.sh@147 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:15.761 04:49:50 -- common/autotest_common.sh@829 -- # '[' -z 3665015 ']' 00:06:15.761 04:49:50 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:15.761 04:49:50 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:15.761 04:49:50 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:15.761 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:15.761 04:49:50 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:15.761 04:49:50 -- common/autotest_common.sh@10 -- # set +x 00:06:16.019 [2024-11-08 04:49:50.888826] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:16.019 [2024-11-08 04:49:50.888920] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3665015 ] 00:06:16.019 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.019 [2024-11-08 04:49:50.957650] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:16.019 [2024-11-08 04:49:50.957683] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:16.019 [2024-11-08 04:49:51.022868] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.019 [2024-11-08 04:49:51.023046] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:16.019 [2024-11-08 04:49:51.023143] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:16.019 [2024-11-08 04:49:51.023143] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:16.953 04:49:51 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:16.953 04:49:51 -- common/autotest_common.sh@862 -- # return 0 00:06:16.953 04:49:51 -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=3665260 00:06:16.953 04:49:51 -- event/cpu_locks.sh@153 -- # waitforlisten 3665260 /var/tmp/spdk2.sock 00:06:16.953 04:49:51 -- event/cpu_locks.sh@151 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:16.953 04:49:51 -- common/autotest_common.sh@829 -- # '[' -z 3665260 ']' 00:06:16.953 04:49:51 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:16.953 04:49:51 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:16.953 04:49:51 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:16.953 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
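
Note the difference from the previous tests: both targets here start with --disable-cpumask-locks, so the overlapping masks do not collide at startup and the conflict is provoked later over RPC. The overlap itself is plain bit arithmetic; with the masks from the trace above:

    # Sketch: the two cpumasks used above share exactly one core.
    m1=0x7     # cores 0,1,2
    m2=0x1c    # cores 2,3,4
    printf 'overlap: 0x%x\n' $(( m1 & m2 ))   # prints 0x4, i.e. core 2
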
00:06:16.953 04:49:51 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:16.953 04:49:51 -- common/autotest_common.sh@10 -- # set +x 00:06:16.953 [2024-11-08 04:49:51.742557] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:16.953 [2024-11-08 04:49:51.742620] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3665260 ] 00:06:16.953 EAL: No free 2048 kB hugepages reported on node 1 00:06:16.953 [2024-11-08 04:49:51.837246] app.c: 795:spdk_app_start: *NOTICE*: CPU core locks deactivated. 00:06:16.953 [2024-11-08 04:49:51.837281] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:16.953 [2024-11-08 04:49:51.976340] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:16.953 [2024-11-08 04:49:51.976489] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:06:16.953 [2024-11-08 04:49:51.979576] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:06:16.953 [2024-11-08 04:49:51.979579] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 4 00:06:17.519 04:49:52 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.519 04:49:52 -- common/autotest_common.sh@862 -- # return 0 00:06:17.519 04:49:52 -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:17.519 04:49:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:17.519 04:49:52 -- common/autotest_common.sh@10 -- # set +x 00:06:17.519 04:49:52 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:17.519 04:49:52 -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:17.519 04:49:52 -- common/autotest_common.sh@650 -- # local es=0 00:06:17.519 04:49:52 -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:17.519 04:49:52 -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:17.519 04:49:52 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:17.519 04:49:52 -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:17.519 04:49:52 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:17.519 04:49:52 -- common/autotest_common.sh@653 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:17.519 04:49:52 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:17.519 04:49:52 -- common/autotest_common.sh@10 -- # set +x 00:06:17.519 [2024-11-08 04:49:52.612586] app.c: 666:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 3665015 has claimed it. 
00:06:17.519 request: 00:06:17.519 { 00:06:17.519 "method": "framework_enable_cpumask_locks", 00:06:17.519 "req_id": 1 00:06:17.519 } 00:06:17.519 Got JSON-RPC error response 00:06:17.519 response: 00:06:17.519 { 00:06:17.519 "code": -32603, 00:06:17.519 "message": "Failed to claim CPU core: 2" 00:06:17.519 } 00:06:17.519 04:49:52 -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:17.519 04:49:52 -- common/autotest_common.sh@653 -- # es=1 00:06:17.519 04:49:52 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:17.519 04:49:52 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:17.519 04:49:52 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:17.519 04:49:52 -- event/cpu_locks.sh@158 -- # waitforlisten 3665015 /var/tmp/spdk.sock 00:06:17.519 04:49:52 -- common/autotest_common.sh@829 -- # '[' -z 3665015 ']' 00:06:17.519 04:49:52 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:17.519 04:49:52 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:17.519 04:49:52 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:17.519 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:17.519 04:49:52 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:17.519 04:49:52 -- common/autotest_common.sh@10 -- # set +x 00:06:17.777 04:49:52 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:17.777 04:49:52 -- common/autotest_common.sh@862 -- # return 0 00:06:17.777 04:49:52 -- event/cpu_locks.sh@159 -- # waitforlisten 3665260 /var/tmp/spdk2.sock 00:06:17.777 04:49:52 -- common/autotest_common.sh@829 -- # '[' -z 3665260 ']' 00:06:17.777 04:49:52 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:17.777 04:49:52 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:17.777 04:49:52 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:17.777 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 
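
The request/response pair above can be reproduced with SPDK's rpc.py client: re-enabling the locks on the first target succeeds, while the same call against the second target fails with the -32603 error because core 2 is already claimed. A sketch, assuming rpc.py at its usual location in the SPDK tree:

    # Sketch: drive the same JSON-RPC by hand.
    RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py

    $RPC framework_enable_cpumask_locks                          # first target: ok
    $RPC -s /var/tmp/spdk2.sock framework_enable_cpumask_locks   # fails: core 2 claimed
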
00:06:17.777 04:49:52 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:17.777 04:49:52 -- common/autotest_common.sh@10 -- # set +x 00:06:18.035 04:49:52 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:18.035 04:49:52 -- common/autotest_common.sh@862 -- # return 0 00:06:18.035 04:49:52 -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:18.035 04:49:52 -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:18.035 04:49:52 -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:18.035 04:49:52 -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:18.035 00:06:18.035 real 0m2.136s 00:06:18.035 user 0m0.869s 00:06:18.035 sys 0m0.190s 00:06:18.035 04:49:52 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.035 04:49:52 -- common/autotest_common.sh@10 -- # set +x 00:06:18.035 ************************************ 00:06:18.035 END TEST locking_overlapped_coremask_via_rpc 00:06:18.035 ************************************ 00:06:18.035 04:49:53 -- event/cpu_locks.sh@174 -- # cleanup 00:06:18.035 04:49:53 -- event/cpu_locks.sh@15 -- # [[ -z 3665015 ]] 00:06:18.035 04:49:53 -- event/cpu_locks.sh@15 -- # killprocess 3665015 00:06:18.035 04:49:53 -- common/autotest_common.sh@936 -- # '[' -z 3665015 ']' 00:06:18.035 04:49:53 -- common/autotest_common.sh@940 -- # kill -0 3665015 00:06:18.035 04:49:53 -- common/autotest_common.sh@941 -- # uname 00:06:18.035 04:49:53 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:18.035 04:49:53 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3665015 00:06:18.035 04:49:53 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:18.035 04:49:53 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:18.035 04:49:53 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3665015' 00:06:18.035 killing process with pid 3665015 00:06:18.035 04:49:53 -- common/autotest_common.sh@955 -- # kill 3665015 00:06:18.035 04:49:53 -- common/autotest_common.sh@960 -- # wait 3665015 00:06:18.604 04:49:53 -- event/cpu_locks.sh@16 -- # [[ -z 3665260 ]] 00:06:18.604 04:49:53 -- event/cpu_locks.sh@16 -- # killprocess 3665260 00:06:18.604 04:49:53 -- common/autotest_common.sh@936 -- # '[' -z 3665260 ']' 00:06:18.604 04:49:53 -- common/autotest_common.sh@940 -- # kill -0 3665260 00:06:18.604 04:49:53 -- common/autotest_common.sh@941 -- # uname 00:06:18.604 04:49:53 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:18.604 04:49:53 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3665260 00:06:18.604 04:49:53 -- common/autotest_common.sh@942 -- # process_name=reactor_2 00:06:18.604 04:49:53 -- common/autotest_common.sh@946 -- # '[' reactor_2 = sudo ']' 00:06:18.604 04:49:53 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3665260' 00:06:18.604 killing process with pid 3665260 00:06:18.604 04:49:53 -- common/autotest_common.sh@955 -- # kill 3665260 00:06:18.604 04:49:53 -- common/autotest_common.sh@960 -- # wait 3665260 00:06:18.862 04:49:53 -- event/cpu_locks.sh@18 -- # rm -f 00:06:18.862 04:49:53 -- event/cpu_locks.sh@1 -- # cleanup 00:06:18.862 04:49:53 -- event/cpu_locks.sh@15 -- # [[ -z 3665015 ]] 00:06:18.862 04:49:53 -- event/cpu_locks.sh@15 -- # killprocess 3665015 
00:06:18.862 04:49:53 -- common/autotest_common.sh@936 -- # '[' -z 3665015 ']' 00:06:18.862 04:49:53 -- common/autotest_common.sh@940 -- # kill -0 3665015 00:06:18.862 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (3665015) - No such process 00:06:18.862 04:49:53 -- common/autotest_common.sh@963 -- # echo 'Process with pid 3665015 is not found' 00:06:18.862 Process with pid 3665015 is not found 00:06:18.862 04:49:53 -- event/cpu_locks.sh@16 -- # [[ -z 3665260 ]] 00:06:18.862 04:49:53 -- event/cpu_locks.sh@16 -- # killprocess 3665260 00:06:18.862 04:49:53 -- common/autotest_common.sh@936 -- # '[' -z 3665260 ']' 00:06:18.862 04:49:53 -- common/autotest_common.sh@940 -- # kill -0 3665260 00:06:18.862 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh: line 940: kill: (3665260) - No such process 00:06:18.862 04:49:53 -- common/autotest_common.sh@963 -- # echo 'Process with pid 3665260 is not found' 00:06:18.862 Process with pid 3665260 is not found 00:06:18.862 04:49:53 -- event/cpu_locks.sh@18 -- # rm -f 00:06:18.862 00:06:18.862 real 0m18.527s 00:06:18.862 user 0m31.379s 00:06:18.862 sys 0m5.888s 00:06:18.862 04:49:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.862 04:49:53 -- common/autotest_common.sh@10 -- # set +x 00:06:18.862 ************************************ 00:06:18.862 END TEST cpu_locks 00:06:18.862 ************************************ 00:06:18.862 00:06:18.862 real 0m44.287s 00:06:18.862 user 1m23.610s 00:06:18.862 sys 0m9.980s 00:06:18.862 04:49:53 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:18.862 04:49:53 -- common/autotest_common.sh@10 -- # set +x 00:06:18.862 ************************************ 00:06:18.862 END TEST event 00:06:18.862 ************************************ 00:06:18.862 04:49:53 -- spdk/autotest.sh@175 -- # run_test thread /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:18.862 04:49:53 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:18.862 04:49:53 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:18.862 04:49:53 -- common/autotest_common.sh@10 -- # set +x 00:06:18.862 ************************************ 00:06:18.862 START TEST thread 00:06:18.862 ************************************ 00:06:18.862 04:49:53 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/thread.sh 00:06:18.862 * Looking for test storage... 
00:06:19.121 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread 00:06:19.121 04:49:53 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:19.121 04:49:53 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:19.121 04:49:53 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:19.121 04:49:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:19.121 04:49:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:19.121 04:49:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:19.121 04:49:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:19.121 04:49:54 -- scripts/common.sh@335 -- # IFS=.-: 00:06:19.121 04:49:54 -- scripts/common.sh@335 -- # read -ra ver1 00:06:19.121 04:49:54 -- scripts/common.sh@336 -- # IFS=.-: 00:06:19.121 04:49:54 -- scripts/common.sh@336 -- # read -ra ver2 00:06:19.121 04:49:54 -- scripts/common.sh@337 -- # local 'op=<' 00:06:19.121 04:49:54 -- scripts/common.sh@339 -- # ver1_l=2 00:06:19.121 04:49:54 -- scripts/common.sh@340 -- # ver2_l=1 00:06:19.121 04:49:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:19.121 04:49:54 -- scripts/common.sh@343 -- # case "$op" in 00:06:19.121 04:49:54 -- scripts/common.sh@344 -- # : 1 00:06:19.121 04:49:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:19.121 04:49:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:19.121 04:49:54 -- scripts/common.sh@364 -- # decimal 1 00:06:19.121 04:49:54 -- scripts/common.sh@352 -- # local d=1 00:06:19.121 04:49:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:19.121 04:49:54 -- scripts/common.sh@354 -- # echo 1 00:06:19.121 04:49:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:19.121 04:49:54 -- scripts/common.sh@365 -- # decimal 2 00:06:19.121 04:49:54 -- scripts/common.sh@352 -- # local d=2 00:06:19.121 04:49:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:19.121 04:49:54 -- scripts/common.sh@354 -- # echo 2 00:06:19.121 04:49:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:19.121 04:49:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:19.121 04:49:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:19.121 04:49:54 -- scripts/common.sh@367 -- # return 0 00:06:19.121 04:49:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:19.121 04:49:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:19.121 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:19.121 --rc genhtml_branch_coverage=1 00:06:19.121 --rc genhtml_function_coverage=1 00:06:19.121 --rc genhtml_legend=1 00:06:19.121 --rc geninfo_all_blocks=1 00:06:19.121 --rc geninfo_unexecuted_blocks=1 00:06:19.121 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:19.121 ' 00:06:19.121 04:49:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:19.121 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:19.121 --rc genhtml_branch_coverage=1 00:06:19.121 --rc genhtml_function_coverage=1 00:06:19.121 --rc genhtml_legend=1 00:06:19.121 --rc geninfo_all_blocks=1 00:06:19.121 --rc geninfo_unexecuted_blocks=1 00:06:19.121 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:19.121 ' 00:06:19.121 04:49:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:19.121 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:19.121 --rc genhtml_branch_coverage=1 
00:06:19.121 --rc genhtml_function_coverage=1 00:06:19.121 --rc genhtml_legend=1 00:06:19.121 --rc geninfo_all_blocks=1 00:06:19.121 --rc geninfo_unexecuted_blocks=1 00:06:19.121 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:19.121 ' 00:06:19.121 04:49:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:19.121 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:19.121 --rc genhtml_branch_coverage=1 00:06:19.121 --rc genhtml_function_coverage=1 00:06:19.121 --rc genhtml_legend=1 00:06:19.121 --rc geninfo_all_blocks=1 00:06:19.121 --rc geninfo_unexecuted_blocks=1 00:06:19.121 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:19.121 ' 00:06:19.121 04:49:54 -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:19.121 04:49:54 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:19.121 04:49:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:19.121 04:49:54 -- common/autotest_common.sh@10 -- # set +x 00:06:19.121 ************************************ 00:06:19.121 START TEST thread_poller_perf 00:06:19.121 ************************************ 00:06:19.121 04:49:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:19.121 [2024-11-08 04:49:54.083258] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:19.121 [2024-11-08 04:49:54.083347] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3665658 ] 00:06:19.121 EAL: No free 2048 kB hugepages reported on node 1 00:06:19.121 [2024-11-08 04:49:54.152565] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:19.121 [2024-11-08 04:49:54.222413] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:19.121 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:20.495 [2024-11-08T03:49:55.605Z] ====================================== 00:06:20.495 [2024-11-08T03:49:55.605Z] busy:2504102822 (cyc) 00:06:20.495 [2024-11-08T03:49:55.605Z] total_run_count: 810000 00:06:20.495 [2024-11-08T03:49:55.605Z] tsc_hz: 2500000000 (cyc) 00:06:20.495 [2024-11-08T03:49:55.605Z] ====================================== 00:06:20.495 [2024-11-08T03:49:55.605Z] poller_cost: 3091 (cyc), 1236 (nsec) 00:06:20.495 00:06:20.495 real 0m1.221s 00:06:20.495 user 0m1.127s 00:06:20.495 sys 0m0.089s 00:06:20.495 04:49:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:20.495 04:49:55 -- common/autotest_common.sh@10 -- # set +x 00:06:20.495 ************************************ 00:06:20.495 END TEST thread_poller_perf 00:06:20.495 ************************************ 00:06:20.495 04:49:55 -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:20.495 04:49:55 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:06:20.495 04:49:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:20.495 04:49:55 -- common/autotest_common.sh@10 -- # set +x 00:06:20.495 ************************************ 00:06:20.495 START TEST thread_poller_perf 00:06:20.495 ************************************ 00:06:20.495 04:49:55 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:20.495 [2024-11-08 04:49:55.353211] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:20.495 [2024-11-08 04:49:55.353300] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3665944 ] 00:06:20.495 EAL: No free 2048 kB hugepages reported on node 1 00:06:20.495 [2024-11-08 04:49:55.421699] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:20.495 [2024-11-08 04:49:55.488003] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:20.495 Running 1000 pollers for 1 seconds with 0 microseconds period. 
00:06:21.868 [2024-11-08T03:49:56.978Z] ====================================== 00:06:21.868 [2024-11-08T03:49:56.978Z] busy:2501834162 (cyc) 00:06:21.868 [2024-11-08T03:49:56.978Z] total_run_count: 13635000 00:06:21.868 [2024-11-08T03:49:56.978Z] tsc_hz: 2500000000 (cyc) 00:06:21.868 [2024-11-08T03:49:56.978Z] ====================================== 00:06:21.868 [2024-11-08T03:49:56.978Z] poller_cost: 183 (cyc), 73 (nsec) 00:06:21.868 00:06:21.868 real 0m1.215s 00:06:21.868 user 0m1.124s 00:06:21.868 sys 0m0.086s 00:06:21.868 04:49:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:21.868 04:49:56 -- common/autotest_common.sh@10 -- # set +x 00:06:21.868 ************************************ 00:06:21.869 END TEST thread_poller_perf 00:06:21.869 ************************************ 00:06:21.869 04:49:56 -- thread/thread.sh@17 -- # [[ n != \y ]] 00:06:21.869 04:49:56 -- thread/thread.sh@18 -- # run_test thread_spdk_lock /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:21.869 04:49:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:21.869 04:49:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:21.869 04:49:56 -- common/autotest_common.sh@10 -- # set +x 00:06:21.869 ************************************ 00:06:21.869 START TEST thread_spdk_lock 00:06:21.869 ************************************ 00:06:21.869 04:49:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock 00:06:21.869 [2024-11-08 04:49:56.614222] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:21.869 [2024-11-08 04:49:56.614292] [ DPDK EAL parameters: spdk_lock_test --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3666235 ] 00:06:21.869 EAL: No free 2048 kB hugepages reported on node 1 00:06:21.869 [2024-11-08 04:49:56.679134] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:21.869 [2024-11-08 04:49:56.746891] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:06:21.869 [2024-11-08 04:49:56.746893] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:22.435 [2024-11-08 04:49:57.237149] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 957:thread_execute_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:22.435 [2024-11-08 04:49:57.237196] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3064:spdk_spin_lock: *ERROR*: unrecoverable spinlock error 2: Deadlock detected (thread != sspin->thread) 00:06:22.435 [2024-11-08 04:49:57.237208] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:3019:sspin_stacks_print: *ERROR*: spinlock 0x1483c80 00:06:22.435 [2024-11-08 04:49:57.238105] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:22.435 [2024-11-08 04:49:57.238209] /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1018:thread_execute_timed_poller: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:22.435 [2024-11-08 04:49:57.238227] 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c: 852:msg_queue_run_batch: *ERROR*: unrecoverable spinlock error 7: Lock(s) held while SPDK thread going off CPU (thread->lock_count == 0) 00:06:22.435 Starting test contend 00:06:22.435 Worker Delay Wait us Hold us Total us 00:06:22.435 0 3 168261 184744 353006 00:06:22.435 1 5 85496 287005 372501 00:06:22.436 PASS test contend 00:06:22.436 Starting test hold_by_poller 00:06:22.436 PASS test hold_by_poller 00:06:22.436 Starting test hold_by_message 00:06:22.436 PASS test hold_by_message 00:06:22.436 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/thread/lock/spdk_lock summary: 00:06:22.436 100014 assertions passed 00:06:22.436 0 assertions failed 00:06:22.436 00:06:22.436 real 0m0.697s 00:06:22.436 user 0m1.102s 00:06:22.436 sys 0m0.083s 00:06:22.436 04:49:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:22.436 04:49:57 -- common/autotest_common.sh@10 -- # set +x 00:06:22.436 ************************************ 00:06:22.436 END TEST thread_spdk_lock 00:06:22.436 ************************************ 00:06:22.436 00:06:22.436 real 0m3.467s 00:06:22.436 user 0m3.500s 00:06:22.436 sys 0m0.486s 00:06:22.436 04:49:57 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:22.436 04:49:57 -- common/autotest_common.sh@10 -- # set +x 00:06:22.436 ************************************ 00:06:22.436 END TEST thread 00:06:22.436 ************************************ 00:06:22.436 04:49:57 -- spdk/autotest.sh@176 -- # run_test accel /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:22.436 04:49:57 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:06:22.436 04:49:57 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:22.436 04:49:57 -- common/autotest_common.sh@10 -- # set +x 00:06:22.436 ************************************ 00:06:22.436 START TEST accel 00:06:22.436 ************************************ 00:06:22.436 04:49:57 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel.sh 00:06:22.436 * Looking for test storage... 00:06:22.436 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:06:22.436 04:49:57 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:06:22.436 04:49:57 -- common/autotest_common.sh@1690 -- # lcov --version 00:06:22.436 04:49:57 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:06:22.436 04:49:57 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:06:22.436 04:49:57 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:06:22.436 04:49:57 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:06:22.436 04:49:57 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:06:22.436 04:49:57 -- scripts/common.sh@335 -- # IFS=.-: 00:06:22.436 04:49:57 -- scripts/common.sh@335 -- # read -ra ver1 00:06:22.436 04:49:57 -- scripts/common.sh@336 -- # IFS=.-: 00:06:22.436 04:49:57 -- scripts/common.sh@336 -- # read -ra ver2 00:06:22.436 04:49:57 -- scripts/common.sh@337 -- # local 'op=<' 00:06:22.436 04:49:57 -- scripts/common.sh@339 -- # ver1_l=2 00:06:22.436 04:49:57 -- scripts/common.sh@340 -- # ver2_l=1 00:06:22.436 04:49:57 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:06:22.436 04:49:57 -- scripts/common.sh@343 -- # case "$op" in 00:06:22.436 04:49:57 -- scripts/common.sh@344 -- # : 1 00:06:22.436 04:49:57 -- scripts/common.sh@363 -- # (( v = 0 )) 00:06:22.436 04:49:57 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:22.436 04:49:57 -- scripts/common.sh@364 -- # decimal 1 00:06:22.436 04:49:57 -- scripts/common.sh@352 -- # local d=1 00:06:22.436 04:49:57 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:22.436 04:49:57 -- scripts/common.sh@354 -- # echo 1 00:06:22.436 04:49:57 -- scripts/common.sh@364 -- # ver1[v]=1 00:06:22.436 04:49:57 -- scripts/common.sh@365 -- # decimal 2 00:06:22.694 04:49:57 -- scripts/common.sh@352 -- # local d=2 00:06:22.694 04:49:57 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:22.694 04:49:57 -- scripts/common.sh@354 -- # echo 2 00:06:22.694 04:49:57 -- scripts/common.sh@365 -- # ver2[v]=2 00:06:22.694 04:49:57 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:06:22.694 04:49:57 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:06:22.694 04:49:57 -- scripts/common.sh@367 -- # return 0 00:06:22.694 04:49:57 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:22.694 04:49:57 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:06:22.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:22.694 --rc genhtml_branch_coverage=1 00:06:22.694 --rc genhtml_function_coverage=1 00:06:22.694 --rc genhtml_legend=1 00:06:22.694 --rc geninfo_all_blocks=1 00:06:22.694 --rc geninfo_unexecuted_blocks=1 00:06:22.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:22.694 ' 00:06:22.694 04:49:57 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:06:22.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:22.694 --rc genhtml_branch_coverage=1 00:06:22.694 --rc genhtml_function_coverage=1 00:06:22.694 --rc genhtml_legend=1 00:06:22.694 --rc geninfo_all_blocks=1 00:06:22.694 --rc geninfo_unexecuted_blocks=1 00:06:22.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:22.694 ' 00:06:22.694 04:49:57 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:06:22.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:22.694 --rc genhtml_branch_coverage=1 00:06:22.694 --rc genhtml_function_coverage=1 00:06:22.694 --rc genhtml_legend=1 00:06:22.694 --rc geninfo_all_blocks=1 00:06:22.694 --rc geninfo_unexecuted_blocks=1 00:06:22.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:22.694 ' 00:06:22.694 04:49:57 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:06:22.694 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:22.694 --rc genhtml_branch_coverage=1 00:06:22.694 --rc genhtml_function_coverage=1 00:06:22.694 --rc genhtml_legend=1 00:06:22.694 --rc geninfo_all_blocks=1 00:06:22.694 --rc geninfo_unexecuted_blocks=1 00:06:22.694 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:06:22.694 ' 00:06:22.694 04:49:57 -- accel/accel.sh@73 -- # declare -A expected_opcs 00:06:22.694 04:49:57 -- accel/accel.sh@74 -- # get_expected_opcs 00:06:22.695 04:49:57 -- accel/accel.sh@57 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:22.695 04:49:57 -- accel/accel.sh@59 -- # spdk_tgt_pid=3666409 00:06:22.695 04:49:57 -- accel/accel.sh@58 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:22.695 04:49:57 -- accel/accel.sh@60 -- # waitforlisten 3666409 00:06:22.695 04:49:57 -- common/autotest_common.sh@829 -- # '[' -z 3666409 ']' 00:06:22.695 04:49:57 -- accel/accel.sh@58 -- # 
build_accel_config 00:06:22.695 04:49:57 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:22.695 04:49:57 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:22.695 04:49:57 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:22.695 04:49:57 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:22.695 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:22.695 04:49:57 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:22.695 04:49:57 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:22.695 04:49:57 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:22.695 04:49:57 -- common/autotest_common.sh@10 -- # set +x 00:06:22.695 04:49:57 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:22.695 04:49:57 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:22.695 04:49:57 -- accel/accel.sh@41 -- # local IFS=, 00:06:22.695 04:49:57 -- accel/accel.sh@42 -- # jq -r . 00:06:22.695 [2024-11-08 04:49:57.565006] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:22.695 [2024-11-08 04:49:57.565057] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3666409 ] 00:06:22.695 EAL: No free 2048 kB hugepages reported on node 1 00:06:22.695 [2024-11-08 04:49:57.630792] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:22.695 [2024-11-08 04:49:57.706153] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:06:22.695 [2024-11-08 04:49:57.706257] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.629 04:49:58 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:23.629 04:49:58 -- common/autotest_common.sh@862 -- # return 0 00:06:23.629 04:49:58 -- accel/accel.sh@62 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:23.629 04:49:58 -- accel/accel.sh@62 -- # rpc_cmd accel_get_opc_assignments 00:06:23.629 04:49:58 -- accel/accel.sh@62 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:23.629 04:49:58 -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:23.629 04:49:58 -- common/autotest_common.sh@10 -- # set +x 00:06:23.629 04:49:58 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 
04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@63 -- # for opc_opt in "${exp_opcs[@]}" 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # IFS== 00:06:23.629 04:49:58 -- accel/accel.sh@64 -- # read -r opc module 00:06:23.629 04:49:58 -- accel/accel.sh@65 -- # expected_opcs["$opc"]=software 00:06:23.629 04:49:58 -- accel/accel.sh@67 -- # killprocess 3666409 00:06:23.629 04:49:58 -- common/autotest_common.sh@936 -- # '[' -z 3666409 ']' 00:06:23.629 04:49:58 -- common/autotest_common.sh@940 -- # kill -0 3666409 00:06:23.629 04:49:58 -- common/autotest_common.sh@941 -- # uname 00:06:23.629 04:49:58 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:06:23.629 04:49:58 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3666409 00:06:23.629 04:49:58 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:06:23.629 04:49:58 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:06:23.629 04:49:58 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3666409' 00:06:23.629 killing process with pid 3666409 00:06:23.629 04:49:58 -- common/autotest_common.sh@955 -- # kill 3666409 00:06:23.629 04:49:58 -- common/autotest_common.sh@960 -- # wait 3666409 00:06:23.887 04:49:58 -- accel/accel.sh@68 -- # trap - ERR 00:06:23.888 04:49:58 -- accel/accel.sh@81 -- # run_test accel_help accel_perf -h 00:06:23.888 04:49:58 -- common/autotest_common.sh@1087 -- # '[' 3 -le 1 ']' 00:06:23.888 04:49:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:23.888 04:49:58 -- common/autotest_common.sh@10 -- # set +x 00:06:23.888 04:49:58 -- common/autotest_common.sh@1114 -- # accel_perf -h 00:06:23.888 04:49:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:23.888 04:49:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.888 04:49:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:23.888 04:49:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.888 04:49:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.888 04:49:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:23.888 04:49:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:23.888 04:49:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:23.888 04:49:58 -- accel/accel.sh@42 -- # jq -r . 
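
The opcode walk a few entries up comes from accel_get_opc_assignments: the RPC returns a JSON object mapping each operation to the module that handles it (all software in this run), and the script flattens it to opc=module pairs with jq before reading them back in a loop. The same query can be run standalone; a sketch reusing the exact jq filter from the trace:

    # Sketch: list opcode->module assignments the way accel.sh does.
    RPC=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py

    $RPC accel_get_opc_assignments \
      | jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' \
      | while IFS== read -r opc module; do
            echo "opcode $opc handled by $module"
        done
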
00:06:23.888 04:49:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:23.888 04:49:58 -- common/autotest_common.sh@10 -- # set +x 00:06:23.888 04:49:58 -- accel/accel.sh@83 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:23.888 04:49:58 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:23.888 04:49:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:23.888 04:49:58 -- common/autotest_common.sh@10 -- # set +x 00:06:23.888 ************************************ 00:06:23.888 START TEST accel_missing_filename 00:06:23.888 ************************************ 00:06:23.888 04:49:58 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress 00:06:23.888 04:49:58 -- common/autotest_common.sh@650 -- # local es=0 00:06:23.888 04:49:58 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:23.888 04:49:58 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:23.888 04:49:58 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:23.888 04:49:58 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:23.888 04:49:58 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:23.888 04:49:58 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:06:23.888 04:49:58 -- accel/accel.sh@12 -- # build_accel_config 00:06:23.888 04:49:58 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:23.888 04:49:58 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:23.888 04:49:58 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:23.888 04:49:58 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:23.888 04:49:58 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:23.888 04:49:58 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:23.888 04:49:58 -- accel/accel.sh@41 -- # local IFS=, 00:06:23.888 04:49:58 -- accel/accel.sh@42 -- # jq -r . 00:06:23.888 [2024-11-08 04:49:58.912687] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:23.888 [2024-11-08 04:49:58.912778] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3666631 ] 00:06:23.888 EAL: No free 2048 kB hugepages reported on node 1 00:06:23.888 [2024-11-08 04:49:58.981652] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.146 [2024-11-08 04:49:59.050817] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.146 [2024-11-08 04:49:59.090454] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:24.146 [2024-11-08 04:49:59.150432] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:24.146 A filename is required. 
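
The "A filename is required." abort is exactly what accel_missing_filename checks: compress and decompress workloads read their input from a file, so -w compress without -l must fail. The positive form, with the input file the next test uses, and the -y variant that the following test expects to abort:

    # Sketch: compress workload with an input file (paths from this log).
    ACCEL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf
    BIB=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib

    $ACCEL -t 1 -w compress -l "$BIB"       # valid: input file supplied
    $ACCEL -t 1 -w compress -l "$BIB" -y    # aborts: compress does not support verify
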
00:06:24.146 04:49:59 -- common/autotest_common.sh@653 -- # es=234 00:06:24.146 04:49:59 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:24.146 04:49:59 -- common/autotest_common.sh@662 -- # es=106 00:06:24.146 04:49:59 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:24.146 04:49:59 -- common/autotest_common.sh@670 -- # es=1 00:06:24.146 04:49:59 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:24.146 00:06:24.146 real 0m0.329s 00:06:24.146 user 0m0.227s 00:06:24.146 sys 0m0.133s 00:06:24.146 04:49:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:24.146 04:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:24.146 ************************************ 00:06:24.146 END TEST accel_missing_filename 00:06:24.146 ************************************ 00:06:24.146 04:49:59 -- accel/accel.sh@85 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.146 04:49:59 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:24.146 04:49:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.146 04:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:24.404 ************************************ 00:06:24.404 START TEST accel_compress_verify 00:06:24.404 ************************************ 00:06:24.404 04:49:59 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.404 04:49:59 -- common/autotest_common.sh@650 -- # local es=0 00:06:24.404 04:49:59 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.404 04:49:59 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:24.404 04:49:59 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.404 04:49:59 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:24.404 04:49:59 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.404 04:49:59 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.404 04:49:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:06:24.404 04:49:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.404 04:49:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.404 04:49:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.404 04:49:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.404 04:49:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.404 04:49:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.404 04:49:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.404 04:49:59 -- accel/accel.sh@42 -- # jq -r . 00:06:24.404 [2024-11-08 04:49:59.285179] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:24.404 [2024-11-08 04:49:59.285266] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3666829 ] 00:06:24.404 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.404 [2024-11-08 04:49:59.361333] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.404 [2024-11-08 04:49:59.449088] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.404 [2024-11-08 04:49:59.496097] app.c: 910:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:24.663 [2024-11-08 04:49:59.560326] accel_perf.c:1385:main: *ERROR*: ERROR starting application 00:06:24.663 00:06:24.663 Compression does not support the verify option, aborting. 00:06:24.663 04:49:59 -- common/autotest_common.sh@653 -- # es=161 00:06:24.663 04:49:59 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:24.663 04:49:59 -- common/autotest_common.sh@662 -- # es=33 00:06:24.663 04:49:59 -- common/autotest_common.sh@663 -- # case "$es" in 00:06:24.663 04:49:59 -- common/autotest_common.sh@670 -- # es=1 00:06:24.663 04:49:59 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:24.663 00:06:24.663 real 0m0.396s 00:06:24.664 user 0m0.284s 00:06:24.664 sys 0m0.151s 00:06:24.664 04:49:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:24.664 04:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:24.664 ************************************ 00:06:24.664 END TEST accel_compress_verify 00:06:24.664 ************************************ 00:06:24.664 04:49:59 -- accel/accel.sh@87 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:24.664 04:49:59 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:24.664 04:49:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.664 04:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:24.664 ************************************ 00:06:24.664 START TEST accel_wrong_workload 00:06:24.664 ************************************ 00:06:24.664 04:49:59 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w foobar 00:06:24.664 04:49:59 -- common/autotest_common.sh@650 -- # local es=0 00:06:24.664 04:49:59 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:24.664 04:49:59 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:24.664 04:49:59 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.664 04:49:59 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:24.664 04:49:59 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.664 04:49:59 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:06:24.664 04:49:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:24.664 04:49:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.664 04:49:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.664 04:49:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.664 04:49:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.664 04:49:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.664 04:49:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.664 04:49:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.664 04:49:59 -- accel/accel.sh@42 -- # jq -r . 
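Before each launch, the build_accel_config trace above assembles an accel_json_cfg array and feeds it through jq -r .; the resulting JSON is what accel_perf receives as the /dev/fd/62 path in its -c argument. A rough sketch of that plumbing, assuming bash process substitution (only the names visible in the trace are real; the rest is illustrative):

    accel_json_cfg=()      # @32: empty by default; JSON fragments are appended only when a module is enabled
    # the [[ 0 -gt 0 ]] tests at @33-@35 and the [[ -n '' ]] test at @37 gate those fragments
    build/examples/accel_perf -c <(printf '%s\n' "${accel_json_cfg[@]}" | jq -r .) -t 1 -w compress
    # a <(...) substitution shows up as a /dev/fd/NN path, which is why -c reads /dev/fd/62 in these traces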
00:06:24.664 Unsupported workload type: foobar 00:06:24.664 [2024-11-08 04:49:59.727885] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:24.664 accel_perf options: 00:06:24.664 [-h help message] 00:06:24.664 [-q queue depth per core] 00:06:24.664 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:24.664 [-T number of threads per core 00:06:24.664 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:24.664 [-t time in seconds] 00:06:24.664 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:24.664 [ dif_verify, , dif_generate, dif_generate_copy 00:06:24.664 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:24.664 [-l for compress/decompress workloads, name of uncompressed input file 00:06:24.664 [-S for crc32c workload, use this seed value (default 0) 00:06:24.664 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:24.664 [-f for fill workload, use this BYTE value (default 255) 00:06:24.664 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:24.664 [-y verify result if this switch is on] 00:06:24.664 [-a tasks to allocate per core (default: same value as -q)] 00:06:24.664 Can be used to spread operations across a wider range of memory. 00:06:24.664 04:49:59 -- common/autotest_common.sh@653 -- # es=1 00:06:24.664 04:49:59 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:24.664 04:49:59 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:24.664 04:49:59 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:24.664 00:06:24.664 real 0m0.030s 00:06:24.664 user 0m0.011s 00:06:24.664 sys 0m0.018s 00:06:24.664 04:49:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:24.664 04:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:24.664 ************************************ 00:06:24.664 END TEST accel_wrong_workload 00:06:24.664 ************************************ 00:06:24.664 Error: writing output failed: Broken pipe 00:06:24.922 04:49:59 -- accel/accel.sh@89 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:24.922 04:49:59 -- common/autotest_common.sh@1087 -- # '[' 10 -le 1 ']' 00:06:24.922 04:49:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.922 04:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:24.922 ************************************ 00:06:24.922 START TEST accel_negative_buffers 00:06:24.922 ************************************ 00:06:24.922 04:49:59 -- common/autotest_common.sh@1114 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:24.922 04:49:59 -- common/autotest_common.sh@650 -- # local es=0 00:06:24.922 04:49:59 -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:24.922 04:49:59 -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:06:24.922 04:49:59 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.922 04:49:59 -- common/autotest_common.sh@642 -- # type -t accel_perf 00:06:24.922 04:49:59 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:24.922 04:49:59 -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:06:24.922 04:49:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w 
xor -y -x -1 00:06:24.922 04:49:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.922 04:49:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.922 04:49:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.922 04:49:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.922 04:49:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.922 04:49:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.922 04:49:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.922 04:49:59 -- accel/accel.sh@42 -- # jq -r . 00:06:24.922 -x option must be non-negative. 00:06:24.922 [2024-11-08 04:49:59.801943] app.c:1292:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:24.922 accel_perf options: 00:06:24.922 [-h help message] 00:06:24.922 [-q queue depth per core] 00:06:24.922 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:24.922 [-T number of threads per core 00:06:24.922 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:24.922 [-t time in seconds] 00:06:24.922 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:24.922 [ dif_verify, , dif_generate, dif_generate_copy 00:06:24.922 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:24.922 [-l for compress/decompress workloads, name of uncompressed input file 00:06:24.922 [-S for crc32c workload, use this seed value (default 0) 00:06:24.922 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:24.922 [-f for fill workload, use this BYTE value (default 255) 00:06:24.922 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:24.922 [-y verify result if this switch is on] 00:06:24.922 [-a tasks to allocate per core (default: same value as -q)] 00:06:24.922 Can be used to spread operations across a wider range of memory. 
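Both failures above are deliberate option-validation checks: foobar is not in the documented workload list, and -x -1 violates the stated minimum of 2 xor source buffers, so each run dies in spdk_app_parse_args before any work starts. For contrast, well-formed invocations built from the same usage text (the crc32c and compress forms appear verbatim elsewhere in this log; the xor line is a hedged example of the documented minimum):

    build/examples/accel_perf -t 1 -w crc32c -S 32 -y               # CRC-32C, seed 32, verify results
    build/examples/accel_perf -t 1 -w xor -y -x 2                   # xor with the minimum 2 source buffers
    build/examples/accel_perf -t 1 -w compress -l test/accel/bib    # compress takes its input file via -l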
00:06:24.922 04:49:59 -- common/autotest_common.sh@653 -- # es=1 00:06:24.922 04:49:59 -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:24.922 04:49:59 -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:24.923 04:49:59 -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:24.923 00:06:24.923 real 0m0.028s 00:06:24.923 user 0m0.010s 00:06:24.923 sys 0m0.018s 00:06:24.923 04:49:59 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:24.923 04:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:24.923 ************************************ 00:06:24.923 END TEST accel_negative_buffers 00:06:24.923 ************************************ 00:06:24.923 Error: writing output failed: Broken pipe 00:06:24.923 04:49:59 -- accel/accel.sh@93 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:24.923 04:49:59 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:24.923 04:49:59 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:24.923 04:49:59 -- common/autotest_common.sh@10 -- # set +x 00:06:24.923 ************************************ 00:06:24.923 START TEST accel_crc32c 00:06:24.923 ************************************ 00:06:24.923 04:49:59 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:24.923 04:49:59 -- accel/accel.sh@16 -- # local accel_opc 00:06:24.923 04:49:59 -- accel/accel.sh@17 -- # local accel_module 00:06:24.923 04:49:59 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:24.923 04:49:59 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:24.923 04:49:59 -- accel/accel.sh@12 -- # build_accel_config 00:06:24.923 04:49:59 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:24.923 04:49:59 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:24.923 04:49:59 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:24.923 04:49:59 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:24.923 04:49:59 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:24.923 04:49:59 -- accel/accel.sh@41 -- # local IFS=, 00:06:24.923 04:49:59 -- accel/accel.sh@42 -- # jq -r . 00:06:24.923 [2024-11-08 04:49:59.874672] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:24.923 [2024-11-08 04:49:59.874754] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3666949 ] 00:06:24.923 EAL: No free 2048 kB hugepages reported on node 1 00:06:24.923 [2024-11-08 04:49:59.945357] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:24.923 [2024-11-08 04:50:00.022476] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.296 04:50:01 -- accel/accel.sh@18 -- # out=' 00:06:26.296 SPDK Configuration: 00:06:26.296 Core mask: 0x1 00:06:26.296 00:06:26.296 Accel Perf Configuration: 00:06:26.296 Workload Type: crc32c 00:06:26.296 CRC-32C seed: 32 00:06:26.296 Transfer size: 4096 bytes 00:06:26.296 Vector count 1 00:06:26.296 Module: software 00:06:26.296 Queue depth: 32 00:06:26.296 Allocate depth: 32 00:06:26.296 # threads/core: 1 00:06:26.296 Run time: 1 seconds 00:06:26.296 Verify: Yes 00:06:26.296 00:06:26.296 Running for 1 seconds... 
00:06:26.296 00:06:26.296 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:26.296 ------------------------------------------------------------------------------------ 00:06:26.296 0,0 680992/s 2660 MiB/s 0 0 00:06:26.296 ==================================================================================== 00:06:26.296 Total 680992/s 2660 MiB/s 0 0' 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:26.296 04:50:01 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:26.296 04:50:01 -- accel/accel.sh@12 -- # build_accel_config 00:06:26.296 04:50:01 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:26.296 04:50:01 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:26.296 04:50:01 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:26.296 04:50:01 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:26.296 04:50:01 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:26.296 04:50:01 -- accel/accel.sh@41 -- # local IFS=, 00:06:26.296 04:50:01 -- accel/accel.sh@42 -- # jq -r . 00:06:26.296 [2024-11-08 04:50:01.215705] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:26.296 [2024-11-08 04:50:01.215797] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3667218 ] 00:06:26.296 EAL: No free 2048 kB hugepages reported on node 1 00:06:26.296 [2024-11-08 04:50:01.285089] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:26.296 [2024-11-08 04:50:01.350970] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val= 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val= 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val=0x1 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val= 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val= 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val=crc32c 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val=32 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 
04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val= 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val=software 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@23 -- # accel_module=software 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val=32 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val=32 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val=1 00:06:26.296 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.296 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.296 04:50:01 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:26.554 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.554 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.554 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.554 04:50:01 -- accel/accel.sh@21 -- # val=Yes 00:06:26.554 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.554 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.554 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.554 04:50:01 -- accel/accel.sh@21 -- # val= 00:06:26.554 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.554 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.554 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:26.554 04:50:01 -- accel/accel.sh@21 -- # val= 00:06:26.554 04:50:01 -- accel/accel.sh@22 -- # case "$var" in 00:06:26.554 04:50:01 -- accel/accel.sh@20 -- # IFS=: 00:06:26.554 04:50:01 -- accel/accel.sh@20 -- # read -r var val 00:06:27.488 04:50:02 -- accel/accel.sh@21 -- # val= 00:06:27.488 04:50:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # IFS=: 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # read -r var val 00:06:27.488 04:50:02 -- accel/accel.sh@21 -- # val= 00:06:27.488 04:50:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # IFS=: 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # read -r var val 00:06:27.488 04:50:02 -- accel/accel.sh@21 -- # val= 00:06:27.488 04:50:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # IFS=: 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # read -r var val 00:06:27.488 04:50:02 -- accel/accel.sh@21 -- # val= 00:06:27.488 04:50:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # IFS=: 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # read -r var val 00:06:27.488 04:50:02 -- accel/accel.sh@21 -- # val= 00:06:27.488 04:50:02 -- accel/accel.sh@22 -- # case "$var" in 
00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # IFS=: 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # read -r var val 00:06:27.488 04:50:02 -- accel/accel.sh@21 -- # val= 00:06:27.488 04:50:02 -- accel/accel.sh@22 -- # case "$var" in 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # IFS=: 00:06:27.488 04:50:02 -- accel/accel.sh@20 -- # read -r var val 00:06:27.488 04:50:02 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:27.488 04:50:02 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:27.488 04:50:02 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:27.488 00:06:27.488 real 0m2.695s 00:06:27.488 user 0m2.433s 00:06:27.488 sys 0m0.270s 00:06:27.488 04:50:02 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:27.488 04:50:02 -- common/autotest_common.sh@10 -- # set +x 00:06:27.488 ************************************ 00:06:27.488 END TEST accel_crc32c 00:06:27.488 ************************************ 00:06:27.488 04:50:02 -- accel/accel.sh@94 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:27.488 04:50:02 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:27.488 04:50:02 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:27.488 04:50:02 -- common/autotest_common.sh@10 -- # set +x 00:06:27.488 ************************************ 00:06:27.488 START TEST accel_crc32c_C2 00:06:27.488 ************************************ 00:06:27.488 04:50:02 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w crc32c -y -C 2 00:06:27.488 04:50:02 -- accel/accel.sh@16 -- # local accel_opc 00:06:27.488 04:50:02 -- accel/accel.sh@17 -- # local accel_module 00:06:27.747 04:50:02 -- accel/accel.sh@18 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:27.747 04:50:02 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:27.747 04:50:02 -- accel/accel.sh@12 -- # build_accel_config 00:06:27.747 04:50:02 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:27.747 04:50:02 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:27.747 04:50:02 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:27.747 04:50:02 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:27.747 04:50:02 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:27.747 04:50:02 -- accel/accel.sh@41 -- # local IFS=, 00:06:27.748 04:50:02 -- accel/accel.sh@42 -- # jq -r . 00:06:27.748 [2024-11-08 04:50:02.608102] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:27.748 [2024-11-08 04:50:02.608154] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3667501 ] 00:06:27.748 EAL: No free 2048 kB hugepages reported on node 1 00:06:27.748 [2024-11-08 04:50:02.672504] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:27.748 [2024-11-08 04:50:02.740514] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.124 04:50:03 -- accel/accel.sh@18 -- # out=' 00:06:29.124 SPDK Configuration: 00:06:29.124 Core mask: 0x1 00:06:29.124 00:06:29.124 Accel Perf Configuration: 00:06:29.124 Workload Type: crc32c 00:06:29.124 CRC-32C seed: 0 00:06:29.124 Transfer size: 4096 bytes 00:06:29.124 Vector count 2 00:06:29.124 Module: software 00:06:29.124 Queue depth: 32 00:06:29.124 Allocate depth: 32 00:06:29.124 # threads/core: 1 00:06:29.124 Run time: 1 seconds 00:06:29.124 Verify: Yes 00:06:29.124 00:06:29.124 Running for 1 seconds... 00:06:29.124 00:06:29.124 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:29.124 ------------------------------------------------------------------------------------ 00:06:29.124 0,0 559104/s 4368 MiB/s 0 0 00:06:29.124 ==================================================================================== 00:06:29.124 Total 559104/s 4368 MiB/s 0 0' 00:06:29.124 04:50:03 -- accel/accel.sh@20 -- # IFS=: 00:06:29.124 04:50:03 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:29.124 04:50:03 -- accel/accel.sh@20 -- # read -r var val 00:06:29.124 04:50:03 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:29.124 04:50:03 -- accel/accel.sh@12 -- # build_accel_config 00:06:29.124 04:50:03 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:29.124 04:50:03 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:29.124 04:50:03 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:29.124 04:50:03 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:29.124 04:50:03 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:29.124 04:50:03 -- accel/accel.sh@41 -- # local IFS=, 00:06:29.124 04:50:03 -- accel/accel.sh@42 -- # jq -r . 00:06:29.124 [2024-11-08 04:50:03.955566] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:06:29.124 [2024-11-08 04:50:03.955630] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3667750 ] 00:06:29.124 EAL: No free 2048 kB hugepages reported on node 1 00:06:29.124 [2024-11-08 04:50:04.019778] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:29.124 [2024-11-08 04:50:04.086421] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.124 04:50:04 -- accel/accel.sh@21 -- # val= 00:06:29.124 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.124 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.124 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val= 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val=0x1 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val= 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val= 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val=crc32c 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@24 -- # accel_opc=crc32c 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val=0 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val= 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val=software 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@23 -- # accel_module=software 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val=32 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val=32 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- 
accel/accel.sh@21 -- # val=1 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val=Yes 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val= 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:29.125 04:50:04 -- accel/accel.sh@21 -- # val= 00:06:29.125 04:50:04 -- accel/accel.sh@22 -- # case "$var" in 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # IFS=: 00:06:29.125 04:50:04 -- accel/accel.sh@20 -- # read -r var val 00:06:30.501 04:50:05 -- accel/accel.sh@21 -- # val= 00:06:30.501 04:50:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.501 04:50:05 -- accel/accel.sh@20 -- # IFS=: 00:06:30.501 04:50:05 -- accel/accel.sh@20 -- # read -r var val 00:06:30.501 04:50:05 -- accel/accel.sh@21 -- # val= 00:06:30.501 04:50:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.501 04:50:05 -- accel/accel.sh@20 -- # IFS=: 00:06:30.501 04:50:05 -- accel/accel.sh@20 -- # read -r var val 00:06:30.501 04:50:05 -- accel/accel.sh@21 -- # val= 00:06:30.501 04:50:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.501 04:50:05 -- accel/accel.sh@20 -- # IFS=: 00:06:30.502 04:50:05 -- accel/accel.sh@20 -- # read -r var val 00:06:30.502 04:50:05 -- accel/accel.sh@21 -- # val= 00:06:30.502 04:50:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.502 04:50:05 -- accel/accel.sh@20 -- # IFS=: 00:06:30.502 04:50:05 -- accel/accel.sh@20 -- # read -r var val 00:06:30.502 04:50:05 -- accel/accel.sh@21 -- # val= 00:06:30.502 04:50:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.502 04:50:05 -- accel/accel.sh@20 -- # IFS=: 00:06:30.502 04:50:05 -- accel/accel.sh@20 -- # read -r var val 00:06:30.502 04:50:05 -- accel/accel.sh@21 -- # val= 00:06:30.502 04:50:05 -- accel/accel.sh@22 -- # case "$var" in 00:06:30.502 04:50:05 -- accel/accel.sh@20 -- # IFS=: 00:06:30.502 04:50:05 -- accel/accel.sh@20 -- # read -r var val 00:06:30.502 04:50:05 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:30.502 04:50:05 -- accel/accel.sh@28 -- # [[ -n crc32c ]] 00:06:30.502 04:50:05 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:30.502 00:06:30.502 real 0m2.665s 00:06:30.502 user 0m2.431s 00:06:30.502 sys 0m0.243s 00:06:30.502 04:50:05 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:30.502 04:50:05 -- common/autotest_common.sh@10 -- # set +x 00:06:30.502 ************************************ 00:06:30.502 END TEST accel_crc32c_C2 00:06:30.502 ************************************ 00:06:30.502 04:50:05 -- accel/accel.sh@95 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:30.502 04:50:05 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:30.502 04:50:05 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:30.502 04:50:05 -- common/autotest_common.sh@10 -- # set +x 00:06:30.502 ************************************ 00:06:30.502 START TEST accel_copy 
00:06:30.502 ************************************ 00:06:30.502 04:50:05 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy -y 00:06:30.502 04:50:05 -- accel/accel.sh@16 -- # local accel_opc 00:06:30.502 04:50:05 -- accel/accel.sh@17 -- # local accel_module 00:06:30.502 04:50:05 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy -y 00:06:30.502 04:50:05 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:30.502 04:50:05 -- accel/accel.sh@12 -- # build_accel_config 00:06:30.502 04:50:05 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:30.502 04:50:05 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:30.502 04:50:05 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:30.502 04:50:05 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:30.502 04:50:05 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:30.502 04:50:05 -- accel/accel.sh@41 -- # local IFS=, 00:06:30.502 04:50:05 -- accel/accel.sh@42 -- # jq -r . 00:06:30.502 [2024-11-08 04:50:05.323875] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:30.502 [2024-11-08 04:50:05.323963] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3667944 ] 00:06:30.502 EAL: No free 2048 kB hugepages reported on node 1 00:06:30.502 [2024-11-08 04:50:05.393285] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:30.502 [2024-11-08 04:50:05.462766] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.878 04:50:06 -- accel/accel.sh@18 -- # out=' 00:06:31.878 SPDK Configuration: 00:06:31.878 Core mask: 0x1 00:06:31.878 00:06:31.878 Accel Perf Configuration: 00:06:31.878 Workload Type: copy 00:06:31.878 Transfer size: 4096 bytes 00:06:31.878 Vector count 1 00:06:31.878 Module: software 00:06:31.878 Queue depth: 32 00:06:31.878 Allocate depth: 32 00:06:31.878 # threads/core: 1 00:06:31.878 Run time: 1 seconds 00:06:31.878 Verify: Yes 00:06:31.878 00:06:31.878 Running for 1 seconds... 00:06:31.878 00:06:31.878 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:31.878 ------------------------------------------------------------------------------------ 00:06:31.878 0,0 551840/s 2155 MiB/s 0 0 00:06:31.878 ==================================================================================== 00:06:31.878 Total 551840/s 2155 MiB/s 0 0' 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:31.878 04:50:06 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:06:31.878 04:50:06 -- accel/accel.sh@12 -- # build_accel_config 00:06:31.878 04:50:06 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:31.878 04:50:06 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:31.878 04:50:06 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:31.878 04:50:06 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:31.878 04:50:06 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:31.878 04:50:06 -- accel/accel.sh@41 -- # local IFS=, 00:06:31.878 04:50:06 -- accel/accel.sh@42 -- # jq -r . 00:06:31.878 [2024-11-08 04:50:06.654845] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:31.878 [2024-11-08 04:50:06.654935] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3668112 ] 00:06:31.878 EAL: No free 2048 kB hugepages reported on node 1 00:06:31.878 [2024-11-08 04:50:06.726086] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:31.878 [2024-11-08 04:50:06.796339] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val= 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val= 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val=0x1 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val= 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val= 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val=copy 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@24 -- # accel_opc=copy 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val= 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val=software 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@23 -- # accel_module=software 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val=32 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val=32 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val=1 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val=Yes 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val= 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:31.878 04:50:06 -- accel/accel.sh@21 -- # val= 00:06:31.878 04:50:06 -- accel/accel.sh@22 -- # case "$var" in 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # IFS=: 00:06:31.878 04:50:06 -- accel/accel.sh@20 -- # read -r var val 00:06:33.320 04:50:07 -- accel/accel.sh@21 -- # val= 00:06:33.320 04:50:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # IFS=: 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # read -r var val 00:06:33.320 04:50:07 -- accel/accel.sh@21 -- # val= 00:06:33.320 04:50:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # IFS=: 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # read -r var val 00:06:33.320 04:50:07 -- accel/accel.sh@21 -- # val= 00:06:33.320 04:50:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # IFS=: 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # read -r var val 00:06:33.320 04:50:07 -- accel/accel.sh@21 -- # val= 00:06:33.320 04:50:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # IFS=: 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # read -r var val 00:06:33.320 04:50:07 -- accel/accel.sh@21 -- # val= 00:06:33.320 04:50:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # IFS=: 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # read -r var val 00:06:33.320 04:50:07 -- accel/accel.sh@21 -- # val= 00:06:33.320 04:50:07 -- accel/accel.sh@22 -- # case "$var" in 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # IFS=: 00:06:33.320 04:50:07 -- accel/accel.sh@20 -- # read -r var val 00:06:33.320 04:50:07 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:33.320 04:50:07 -- accel/accel.sh@28 -- # [[ -n copy ]] 00:06:33.320 04:50:07 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:33.320 00:06:33.320 real 0m2.669s 00:06:33.320 user 0m2.418s 00:06:33.320 sys 0m0.260s 00:06:33.320 04:50:07 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:33.320 04:50:07 -- common/autotest_common.sh@10 -- # set +x 00:06:33.320 ************************************ 00:06:33.320 END TEST accel_copy 00:06:33.320 ************************************ 00:06:33.320 04:50:08 -- accel/accel.sh@96 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:33.320 04:50:08 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:06:33.320 04:50:08 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:33.320 04:50:08 -- common/autotest_common.sh@10 -- # set +x 00:06:33.320 ************************************ 00:06:33.320 START TEST accel_fill 00:06:33.320 ************************************ 00:06:33.320 04:50:08 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:33.320 04:50:08 -- accel/accel.sh@16 -- # local accel_opc 
00:06:33.320 04:50:08 -- accel/accel.sh@17 -- # local accel_module 00:06:33.320 04:50:08 -- accel/accel.sh@18 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:33.320 04:50:08 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:33.320 04:50:08 -- accel/accel.sh@12 -- # build_accel_config 00:06:33.320 04:50:08 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:33.320 04:50:08 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:33.320 04:50:08 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:33.320 04:50:08 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:33.320 04:50:08 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:33.320 04:50:08 -- accel/accel.sh@41 -- # local IFS=, 00:06:33.320 04:50:08 -- accel/accel.sh@42 -- # jq -r . 00:06:33.320 [2024-11-08 04:50:08.043917] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:33.320 [2024-11-08 04:50:08.044027] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3668370 ] 00:06:33.320 EAL: No free 2048 kB hugepages reported on node 1 00:06:33.320 [2024-11-08 04:50:08.116656] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.320 [2024-11-08 04:50:08.185202] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.255 04:50:09 -- accel/accel.sh@18 -- # out=' 00:06:34.255 SPDK Configuration: 00:06:34.255 Core mask: 0x1 00:06:34.255 00:06:34.255 Accel Perf Configuration: 00:06:34.255 Workload Type: fill 00:06:34.255 Fill pattern: 0x80 00:06:34.255 Transfer size: 4096 bytes 00:06:34.255 Vector count 1 00:06:34.255 Module: software 00:06:34.255 Queue depth: 64 00:06:34.255 Allocate depth: 64 00:06:34.255 # threads/core: 1 00:06:34.255 Run time: 1 seconds 00:06:34.255 Verify: Yes 00:06:34.255 00:06:34.255 Running for 1 seconds... 00:06:34.255 00:06:34.255 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:34.255 ------------------------------------------------------------------------------------ 00:06:34.255 0,0 965248/s 3770 MiB/s 0 0 00:06:34.255 ==================================================================================== 00:06:34.255 Total 965248/s 3770 MiB/s 0 0' 00:06:34.255 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.255 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.255 04:50:09 -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:34.255 04:50:09 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:34.255 04:50:09 -- accel/accel.sh@12 -- # build_accel_config 00:06:34.255 04:50:09 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:34.255 04:50:09 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:34.255 04:50:09 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:34.255 04:50:09 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:34.255 04:50:09 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:34.255 04:50:09 -- accel/accel.sh@41 -- # local IFS=, 00:06:34.255 04:50:09 -- accel/accel.sh@42 -- # jq -r . 00:06:34.514 [2024-11-08 04:50:09.375347] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:34.514 [2024-11-08 04:50:09.375434] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3668639 ] 00:06:34.514 EAL: No free 2048 kB hugepages reported on node 1 00:06:34.514 [2024-11-08 04:50:09.443783] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.514 [2024-11-08 04:50:09.513402] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val= 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val= 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val=0x1 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val= 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val= 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val=fill 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@24 -- # accel_opc=fill 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val=0x80 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val= 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val=software 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@23 -- # accel_module=software 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val=64 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val=64 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- 
accel/accel.sh@21 -- # val=1 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val=Yes 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val= 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:34.514 04:50:09 -- accel/accel.sh@21 -- # val= 00:06:34.514 04:50:09 -- accel/accel.sh@22 -- # case "$var" in 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # IFS=: 00:06:34.514 04:50:09 -- accel/accel.sh@20 -- # read -r var val 00:06:35.888 04:50:10 -- accel/accel.sh@21 -- # val= 00:06:35.888 04:50:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # IFS=: 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # read -r var val 00:06:35.888 04:50:10 -- accel/accel.sh@21 -- # val= 00:06:35.888 04:50:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # IFS=: 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # read -r var val 00:06:35.888 04:50:10 -- accel/accel.sh@21 -- # val= 00:06:35.888 04:50:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # IFS=: 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # read -r var val 00:06:35.888 04:50:10 -- accel/accel.sh@21 -- # val= 00:06:35.888 04:50:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # IFS=: 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # read -r var val 00:06:35.888 04:50:10 -- accel/accel.sh@21 -- # val= 00:06:35.888 04:50:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # IFS=: 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # read -r var val 00:06:35.888 04:50:10 -- accel/accel.sh@21 -- # val= 00:06:35.888 04:50:10 -- accel/accel.sh@22 -- # case "$var" in 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # IFS=: 00:06:35.888 04:50:10 -- accel/accel.sh@20 -- # read -r var val 00:06:35.888 04:50:10 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:35.888 04:50:10 -- accel/accel.sh@28 -- # [[ -n fill ]] 00:06:35.888 04:50:10 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:35.888 00:06:35.888 real 0m2.669s 00:06:35.888 user 0m2.421s 00:06:35.888 sys 0m0.256s 00:06:35.888 04:50:10 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:35.888 04:50:10 -- common/autotest_common.sh@10 -- # set +x 00:06:35.888 ************************************ 00:06:35.888 END TEST accel_fill 00:06:35.888 ************************************ 00:06:35.888 04:50:10 -- accel/accel.sh@97 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:35.888 04:50:10 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:35.888 04:50:10 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:35.888 04:50:10 -- common/autotest_common.sh@10 -- # set +x 00:06:35.888 ************************************ 00:06:35.888 START TEST 
accel_copy_crc32c 00:06:35.888 ************************************ 00:06:35.888 04:50:10 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y 00:06:35.888 04:50:10 -- accel/accel.sh@16 -- # local accel_opc 00:06:35.888 04:50:10 -- accel/accel.sh@17 -- # local accel_module 00:06:35.888 04:50:10 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:35.888 04:50:10 -- accel/accel.sh@12 -- # build_accel_config 00:06:35.888 04:50:10 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:35.888 04:50:10 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:35.888 04:50:10 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.888 04:50:10 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.888 04:50:10 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:35.888 04:50:10 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:35.888 04:50:10 -- accel/accel.sh@41 -- # local IFS=, 00:06:35.888 04:50:10 -- accel/accel.sh@42 -- # jq -r . 00:06:35.888 [2024-11-08 04:50:10.760256] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:35.888 [2024-11-08 04:50:10.760345] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3668920 ] 00:06:35.888 EAL: No free 2048 kB hugepages reported on node 1 00:06:35.888 [2024-11-08 04:50:10.830978] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:35.888 [2024-11-08 04:50:10.900855] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.263 04:50:12 -- accel/accel.sh@18 -- # out=' 00:06:37.263 SPDK Configuration: 00:06:37.263 Core mask: 0x1 00:06:37.263 00:06:37.263 Accel Perf Configuration: 00:06:37.263 Workload Type: copy_crc32c 00:06:37.263 CRC-32C seed: 0 00:06:37.263 Vector size: 4096 bytes 00:06:37.264 Transfer size: 4096 bytes 00:06:37.264 Vector count 1 00:06:37.264 Module: software 00:06:37.264 Queue depth: 32 00:06:37.264 Allocate depth: 32 00:06:37.264 # threads/core: 1 00:06:37.264 Run time: 1 seconds 00:06:37.264 Verify: Yes 00:06:37.264 00:06:37.264 Running for 1 seconds... 00:06:37.264 00:06:37.264 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:37.264 ------------------------------------------------------------------------------------ 00:06:37.264 0,0 433472/s 1693 MiB/s 0 0 00:06:37.264 ==================================================================================== 00:06:37.264 Total 433472/s 1693 MiB/s 0 0' 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:06:37.264 04:50:12 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:37.264 04:50:12 -- accel/accel.sh@12 -- # build_accel_config 00:06:37.264 04:50:12 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:37.264 04:50:12 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.264 04:50:12 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.264 04:50:12 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:37.264 04:50:12 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:37.264 04:50:12 -- accel/accel.sh@41 -- # local IFS=, 00:06:37.264 04:50:12 -- accel/accel.sh@42 -- # jq -r . 
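As a sanity check on these tables, the Bandwidth column is just the Transfers column times the per-operation byte count: for the copy_crc32c row above, 433472 ops/s x 4096 B = 1,775,501,312 B/s, i.e. about 1693 MiB/s (1 MiB = 1,048,576 B). For the -C 2 variants the per-operation size is the vector count times 4096 B, which is why the accel_crc32c_C2 table earlier reports 559104 ops/s as 4368 MiB/s (559104 x 8192 B).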
00:06:37.264 [2024-11-08 04:50:12.090834] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:37.264 [2024-11-08 04:50:12.090910] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3669194 ] 00:06:37.264 EAL: No free 2048 kB hugepages reported on node 1 00:06:37.264 [2024-11-08 04:50:12.158412] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.264 [2024-11-08 04:50:12.224331] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val= 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val= 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val=0x1 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val= 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val= 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val=0 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val= 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val=software 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@23 -- # accel_module=software 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val=32 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 
00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val=32 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val=1 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val=Yes 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val= 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:37.264 04:50:12 -- accel/accel.sh@21 -- # val= 00:06:37.264 04:50:12 -- accel/accel.sh@22 -- # case "$var" in 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # IFS=: 00:06:37.264 04:50:12 -- accel/accel.sh@20 -- # read -r var val 00:06:38.639 04:50:13 -- accel/accel.sh@21 -- # val= 00:06:38.639 04:50:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.639 04:50:13 -- accel/accel.sh@20 -- # IFS=: 00:06:38.639 04:50:13 -- accel/accel.sh@20 -- # read -r var val 00:06:38.639 04:50:13 -- accel/accel.sh@21 -- # val= 00:06:38.639 04:50:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # IFS=: 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # read -r var val 00:06:38.640 04:50:13 -- accel/accel.sh@21 -- # val= 00:06:38.640 04:50:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # IFS=: 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # read -r var val 00:06:38.640 04:50:13 -- accel/accel.sh@21 -- # val= 00:06:38.640 04:50:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # IFS=: 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # read -r var val 00:06:38.640 04:50:13 -- accel/accel.sh@21 -- # val= 00:06:38.640 04:50:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # IFS=: 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # read -r var val 00:06:38.640 04:50:13 -- accel/accel.sh@21 -- # val= 00:06:38.640 04:50:13 -- accel/accel.sh@22 -- # case "$var" in 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # IFS=: 00:06:38.640 04:50:13 -- accel/accel.sh@20 -- # read -r var val 00:06:38.640 04:50:13 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:38.640 04:50:13 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:38.640 04:50:13 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:38.640 00:06:38.640 real 0m2.661s 00:06:38.640 user 0m2.402s 00:06:38.640 sys 0m0.268s 00:06:38.640 04:50:13 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:38.640 04:50:13 -- common/autotest_common.sh@10 -- # set +x 00:06:38.640 ************************************ 00:06:38.640 END TEST accel_copy_crc32c 00:06:38.640 ************************************ 00:06:38.640 
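A note on the copy_crc32c results above: accel.sh runs each workload twice, once to capture accel_perf's results table and once with shell tracing enabled, which is why every configuration variable (val=0x1, val=copy_crc32c, val=32, ...) is echoed back line by line. The operation itself copies a source buffer into a destination and computes a CRC-32C over the copied data, starting from the seed shown in the configuration (0 here). A minimal software-path sketch in C, illustrative only (not SPDK's implementation, which uses table-driven or instruction-accelerated CRC rather than a bitwise loop):

    /* Illustrative sketch only -- not SPDK code. */
    #include <stddef.h>
    #include <stdint.h>
    #include <string.h>

    /* Bitwise CRC-32C (Castagnoli), reflected polynomial 0x82F63B78. */
    static uint32_t crc32c_update(uint32_t crc, const uint8_t *buf, size_t len)
    {
        crc = ~crc;
        for (size_t i = 0; i < len; i++) {
            crc ^= buf[i];
            for (int b = 0; b < 8; b++)
                crc = (crc >> 1) ^ (0x82F63B78u & (0u - (crc & 1u)));
        }
        return ~crc;
    }

    /* copy_crc32c: copy src into dst, return CRC-32C of the copied data. */
    static uint32_t copy_crc32c(uint8_t *dst, const uint8_t *src, size_t len,
                                uint32_t seed)
    {
        memcpy(dst, src, len);
        return crc32c_update(seed, dst, len);
    }

Because the update function is composable, the CRC can chain across several source vectors; that is what the accel_copy_crc32c_C2 test below exercises with -C 2, processing two 4096-byte vectors per operation and therefore reporting an 8192-byte transfer size.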
04:50:13 -- accel/accel.sh@98 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:38.640 04:50:13 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:38.640 04:50:13 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:38.640 04:50:13 -- common/autotest_common.sh@10 -- # set +x 00:06:38.640 ************************************ 00:06:38.640 START TEST accel_copy_crc32c_C2 00:06:38.640 ************************************ 00:06:38.640 04:50:13 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:06:38.640 04:50:13 -- accel/accel.sh@16 -- # local accel_opc 00:06:38.640 04:50:13 -- accel/accel.sh@17 -- # local accel_module 00:06:38.640 04:50:13 -- accel/accel.sh@18 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:38.640 04:50:13 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:38.640 04:50:13 -- accel/accel.sh@12 -- # build_accel_config 00:06:38.640 04:50:13 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:38.640 04:50:13 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.640 04:50:13 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.640 04:50:13 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:38.640 04:50:13 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:38.640 04:50:13 -- accel/accel.sh@41 -- # local IFS=, 00:06:38.640 04:50:13 -- accel/accel.sh@42 -- # jq -r . 00:06:38.640 [2024-11-08 04:50:13.468659] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:38.640 [2024-11-08 04:50:13.468753] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3669475 ] 00:06:38.640 EAL: No free 2048 kB hugepages reported on node 1 00:06:38.640 [2024-11-08 04:50:13.537202] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.640 [2024-11-08 04:50:13.606556] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.016 04:50:14 -- accel/accel.sh@18 -- # out=' 00:06:40.016 SPDK Configuration: 00:06:40.016 Core mask: 0x1 00:06:40.016 00:06:40.016 Accel Perf Configuration: 00:06:40.016 Workload Type: copy_crc32c 00:06:40.016 CRC-32C seed: 0 00:06:40.016 Vector size: 4096 bytes 00:06:40.016 Transfer size: 8192 bytes 00:06:40.016 Vector count 2 00:06:40.016 Module: software 00:06:40.016 Queue depth: 32 00:06:40.016 Allocate depth: 32 00:06:40.016 # threads/core: 1 00:06:40.016 Run time: 1 seconds 00:06:40.016 Verify: Yes 00:06:40.016 00:06:40.016 Running for 1 seconds... 
00:06:40.016 00:06:40.016 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:40.016 ------------------------------------------------------------------------------------ 00:06:40.016 0,0 297760/s 2326 MiB/s 0 0 00:06:40.016 ==================================================================================== 00:06:40.016 Total 297760/s 2326 MiB/s 0 0' 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:40.016 04:50:14 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:40.016 04:50:14 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.016 04:50:14 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:40.016 04:50:14 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.016 04:50:14 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.016 04:50:14 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:40.016 04:50:14 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:40.016 04:50:14 -- accel/accel.sh@41 -- # local IFS=, 00:06:40.016 04:50:14 -- accel/accel.sh@42 -- # jq -r . 00:06:40.016 [2024-11-08 04:50:14.796487] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:40.016 [2024-11-08 04:50:14.796584] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3669748 ] 00:06:40.016 EAL: No free 2048 kB hugepages reported on node 1 00:06:40.016 [2024-11-08 04:50:14.865061] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.016 [2024-11-08 04:50:14.930767] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val= 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val= 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val=0x1 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val= 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val= 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val=copy_crc32c 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@24 -- # accel_opc=copy_crc32c 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val=0 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # 
IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val='8192 bytes' 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val= 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val=software 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@23 -- # accel_module=software 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val=32 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val=32 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val=1 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val=Yes 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val= 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:40.016 04:50:14 -- accel/accel.sh@21 -- # val= 00:06:40.016 04:50:14 -- accel/accel.sh@22 -- # case "$var" in 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # IFS=: 00:06:40.016 04:50:14 -- accel/accel.sh@20 -- # read -r var val 00:06:41.391 04:50:16 -- accel/accel.sh@21 -- # val= 00:06:41.391 04:50:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # IFS=: 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # read -r var val 00:06:41.391 04:50:16 -- accel/accel.sh@21 -- # val= 00:06:41.391 04:50:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # IFS=: 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # read -r var val 00:06:41.391 04:50:16 -- accel/accel.sh@21 -- # val= 00:06:41.391 04:50:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # IFS=: 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # read -r var val 00:06:41.391 04:50:16 -- accel/accel.sh@21 -- # val= 00:06:41.391 04:50:16 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # IFS=: 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # read -r var val 00:06:41.391 04:50:16 -- accel/accel.sh@21 -- # val= 00:06:41.391 04:50:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # IFS=: 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # read -r var val 00:06:41.391 04:50:16 -- accel/accel.sh@21 -- # val= 00:06:41.391 04:50:16 -- accel/accel.sh@22 -- # case "$var" in 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # IFS=: 00:06:41.391 04:50:16 -- accel/accel.sh@20 -- # read -r var val 00:06:41.391 04:50:16 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:41.391 04:50:16 -- accel/accel.sh@28 -- # [[ -n copy_crc32c ]] 00:06:41.391 04:50:16 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:41.391 00:06:41.391 real 0m2.658s 00:06:41.391 user 0m2.413s 00:06:41.391 sys 0m0.254s 00:06:41.391 04:50:16 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:41.391 04:50:16 -- common/autotest_common.sh@10 -- # set +x 00:06:41.391 ************************************ 00:06:41.391 END TEST accel_copy_crc32c_C2 00:06:41.391 ************************************ 00:06:41.391 04:50:16 -- accel/accel.sh@99 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:41.391 04:50:16 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:41.391 04:50:16 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:41.391 04:50:16 -- common/autotest_common.sh@10 -- # set +x 00:06:41.391 ************************************ 00:06:41.391 START TEST accel_dualcast 00:06:41.391 ************************************ 00:06:41.391 04:50:16 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dualcast -y 00:06:41.391 04:50:16 -- accel/accel.sh@16 -- # local accel_opc 00:06:41.391 04:50:16 -- accel/accel.sh@17 -- # local accel_module 00:06:41.391 04:50:16 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dualcast -y 00:06:41.391 04:50:16 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:41.391 04:50:16 -- accel/accel.sh@12 -- # build_accel_config 00:06:41.391 04:50:16 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:41.391 04:50:16 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.391 04:50:16 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.391 04:50:16 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:41.391 04:50:16 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:41.391 04:50:16 -- accel/accel.sh@41 -- # local IFS=, 00:06:41.391 04:50:16 -- accel/accel.sh@42 -- # jq -r . 00:06:41.391 [2024-11-08 04:50:16.176836] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:41.391 [2024-11-08 04:50:16.176913] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3669975 ] 00:06:41.391 EAL: No free 2048 kB hugepages reported on node 1 00:06:41.391 [2024-11-08 04:50:16.244510] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:41.391 [2024-11-08 04:50:16.312167] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.766 04:50:17 -- accel/accel.sh@18 -- # out=' 00:06:42.766 SPDK Configuration: 00:06:42.766 Core mask: 0x1 00:06:42.766 00:06:42.766 Accel Perf Configuration: 00:06:42.766 Workload Type: dualcast 00:06:42.766 Transfer size: 4096 bytes 00:06:42.766 Vector count 1 00:06:42.766 Module: software 00:06:42.766 Queue depth: 32 00:06:42.766 Allocate depth: 32 00:06:42.766 # threads/core: 1 00:06:42.766 Run time: 1 seconds 00:06:42.766 Verify: Yes 00:06:42.766 00:06:42.766 Running for 1 seconds... 00:06:42.766 00:06:42.766 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:42.766 ------------------------------------------------------------------------------------ 00:06:42.766 0,0 633312/s 2473 MiB/s 0 0 00:06:42.766 ==================================================================================== 00:06:42.766 Total 633312/s 2473 MiB/s 0 0' 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:42.766 04:50:17 -- accel/accel.sh@12 -- # build_accel_config 00:06:42.766 04:50:17 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:42.766 04:50:17 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:42.766 04:50:17 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:42.766 04:50:17 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:42.766 04:50:17 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:42.766 04:50:17 -- accel/accel.sh@41 -- # local IFS=, 00:06:42.766 04:50:17 -- accel/accel.sh@42 -- # jq -r . 00:06:42.766 [2024-11-08 04:50:17.498146] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
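The dualcast workload measured above broadcasts one 4096-byte source buffer to two destination buffers per operation, so each of the 633312 transfers per second is one read and two writes. A minimal sketch of the software path (illustrative, not SPDK's code; hardware engines such as Intel DSA can perform the two writes from a single descriptor):

    /* Illustrative sketch only -- not SPDK code. */
    #include <stddef.h>
    #include <string.h>

    /* dualcast: write the same source to two destinations in one operation. */
    static void dualcast(void *dst1, void *dst2, const void *src, size_t len)
    {
        memcpy(dst1, src, len);
        memcpy(dst2, src, len);
    }

With Verify: Yes, the tool then checks both destinations against the source.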
00:06:42.766 [2024-11-08 04:50:17.498239] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3670137 ] 00:06:42.766 EAL: No free 2048 kB hugepages reported on node 1 00:06:42.766 [2024-11-08 04:50:17.566356] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.766 [2024-11-08 04:50:17.632491] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val= 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val= 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val=0x1 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val= 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val= 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val=dualcast 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@24 -- # accel_opc=dualcast 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val= 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val=software 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@23 -- # accel_module=software 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val=32 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val=32 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val=1 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 
-- accel/accel.sh@21 -- # val='1 seconds' 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val=Yes 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.766 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.766 04:50:17 -- accel/accel.sh@21 -- # val= 00:06:42.766 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.767 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.767 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:42.767 04:50:17 -- accel/accel.sh@21 -- # val= 00:06:42.767 04:50:17 -- accel/accel.sh@22 -- # case "$var" in 00:06:42.767 04:50:17 -- accel/accel.sh@20 -- # IFS=: 00:06:42.767 04:50:17 -- accel/accel.sh@20 -- # read -r var val 00:06:43.702 04:50:18 -- accel/accel.sh@21 -- # val= 00:06:43.702 04:50:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # IFS=: 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # read -r var val 00:06:43.702 04:50:18 -- accel/accel.sh@21 -- # val= 00:06:43.702 04:50:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # IFS=: 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # read -r var val 00:06:43.702 04:50:18 -- accel/accel.sh@21 -- # val= 00:06:43.702 04:50:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # IFS=: 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # read -r var val 00:06:43.702 04:50:18 -- accel/accel.sh@21 -- # val= 00:06:43.702 04:50:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # IFS=: 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # read -r var val 00:06:43.702 04:50:18 -- accel/accel.sh@21 -- # val= 00:06:43.702 04:50:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # IFS=: 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # read -r var val 00:06:43.702 04:50:18 -- accel/accel.sh@21 -- # val= 00:06:43.702 04:50:18 -- accel/accel.sh@22 -- # case "$var" in 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # IFS=: 00:06:43.702 04:50:18 -- accel/accel.sh@20 -- # read -r var val 00:06:43.702 04:50:18 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:43.702 04:50:18 -- accel/accel.sh@28 -- # [[ -n dualcast ]] 00:06:43.702 04:50:18 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.702 00:06:43.702 real 0m2.650s 00:06:43.702 user 0m2.412s 00:06:43.702 sys 0m0.246s 00:06:43.702 04:50:18 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:43.702 04:50:18 -- common/autotest_common.sh@10 -- # set +x 00:06:43.702 ************************************ 00:06:43.702 END TEST accel_dualcast 00:06:43.702 ************************************ 00:06:43.961 04:50:18 -- accel/accel.sh@100 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:43.961 04:50:18 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:43.961 04:50:18 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:43.961 04:50:18 -- common/autotest_common.sh@10 -- # set +x 00:06:43.961 ************************************ 00:06:43.961 START TEST accel_compare 00:06:43.962 ************************************ 00:06:43.962 04:50:18 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compare -y 00:06:43.962 04:50:18 -- accel/accel.sh@16 -- # local accel_opc 00:06:43.962 04:50:18 
-- accel/accel.sh@17 -- # local accel_module 00:06:43.962 04:50:18 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compare -y 00:06:43.962 04:50:18 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:43.962 04:50:18 -- accel/accel.sh@12 -- # build_accel_config 00:06:43.962 04:50:18 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:43.962 04:50:18 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.962 04:50:18 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.962 04:50:18 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:43.962 04:50:18 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:43.962 04:50:18 -- accel/accel.sh@41 -- # local IFS=, 00:06:43.962 04:50:18 -- accel/accel.sh@42 -- # jq -r . 00:06:43.962 [2024-11-08 04:50:18.874413] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:43.962 [2024-11-08 04:50:18.874508] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3670352 ] 00:06:43.962 EAL: No free 2048 kB hugepages reported on node 1 00:06:43.962 [2024-11-08 04:50:18.942268] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.962 [2024-11-08 04:50:19.010104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.338 04:50:20 -- accel/accel.sh@18 -- # out=' 00:06:45.338 SPDK Configuration: 00:06:45.338 Core mask: 0x1 00:06:45.338 00:06:45.338 Accel Perf Configuration: 00:06:45.338 Workload Type: compare 00:06:45.338 Transfer size: 4096 bytes 00:06:45.338 Vector count 1 00:06:45.338 Module: software 00:06:45.338 Queue depth: 32 00:06:45.338 Allocate depth: 32 00:06:45.338 # threads/core: 1 00:06:45.338 Run time: 1 seconds 00:06:45.338 Verify: Yes 00:06:45.338 00:06:45.338 Running for 1 seconds... 00:06:45.338 00:06:45.338 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:45.338 ------------------------------------------------------------------------------------ 00:06:45.338 0,0 818272/s 3196 MiB/s 0 0 00:06:45.338 ==================================================================================== 00:06:45.338 Total 818272/s 3196 MiB/s 0 0' 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:45.338 04:50:20 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:45.338 04:50:20 -- accel/accel.sh@12 -- # build_accel_config 00:06:45.338 04:50:20 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:45.338 04:50:20 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:45.338 04:50:20 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:45.338 04:50:20 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:45.338 04:50:20 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:45.338 04:50:20 -- accel/accel.sh@41 -- # local IFS=, 00:06:45.338 04:50:20 -- accel/accel.sh@42 -- # jq -r . 00:06:45.338 [2024-11-08 04:50:20.201518] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
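The compare workload above memcmps two equal-length buffers per operation; since it only reads and writes nothing, it posts the highest transfer rate in this group (818272/s). The Miscompares column counts operations whose comparison outcome differed from the expected one. A per-iteration sketch (illustrative only):

    /* Illustrative sketch only -- not SPDK code. */
    #include <stddef.h>
    #include <string.h>

    /* compare: count iterations in which the buffers unexpectedly differ. */
    static size_t run_compare(const void *a, const void *b, size_t len,
                              size_t iterations)
    {
        size_t miscompares = 0;
        for (size_t i = 0; i < iterations; i++) {
            if (memcmp(a, b, len) != 0)
                miscompares++;
        }
        return miscompares;
    }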
00:06:45.338 [2024-11-08 04:50:20.201605] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3670608 ] 00:06:45.338 EAL: No free 2048 kB hugepages reported on node 1 00:06:45.338 [2024-11-08 04:50:20.269862] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:45.338 [2024-11-08 04:50:20.339879] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val= 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val= 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val=0x1 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val= 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val= 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val=compare 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@24 -- # accel_opc=compare 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val= 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val=software 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@23 -- # accel_module=software 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val=32 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val=32 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val=1 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- 
accel/accel.sh@21 -- # val='1 seconds' 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val=Yes 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val= 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:45.338 04:50:20 -- accel/accel.sh@21 -- # val= 00:06:45.338 04:50:20 -- accel/accel.sh@22 -- # case "$var" in 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # IFS=: 00:06:45.338 04:50:20 -- accel/accel.sh@20 -- # read -r var val 00:06:46.713 04:50:21 -- accel/accel.sh@21 -- # val= 00:06:46.713 04:50:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # IFS=: 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # read -r var val 00:06:46.713 04:50:21 -- accel/accel.sh@21 -- # val= 00:06:46.713 04:50:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # IFS=: 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # read -r var val 00:06:46.713 04:50:21 -- accel/accel.sh@21 -- # val= 00:06:46.713 04:50:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # IFS=: 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # read -r var val 00:06:46.713 04:50:21 -- accel/accel.sh@21 -- # val= 00:06:46.713 04:50:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # IFS=: 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # read -r var val 00:06:46.713 04:50:21 -- accel/accel.sh@21 -- # val= 00:06:46.713 04:50:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # IFS=: 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # read -r var val 00:06:46.713 04:50:21 -- accel/accel.sh@21 -- # val= 00:06:46.713 04:50:21 -- accel/accel.sh@22 -- # case "$var" in 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # IFS=: 00:06:46.713 04:50:21 -- accel/accel.sh@20 -- # read -r var val 00:06:46.713 04:50:21 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:46.713 04:50:21 -- accel/accel.sh@28 -- # [[ -n compare ]] 00:06:46.713 04:50:21 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.713 00:06:46.713 real 0m2.660s 00:06:46.713 user 0m2.422s 00:06:46.713 sys 0m0.246s 00:06:46.713 04:50:21 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:46.713 04:50:21 -- common/autotest_common.sh@10 -- # set +x 00:06:46.713 ************************************ 00:06:46.713 END TEST accel_compare 00:06:46.713 ************************************ 00:06:46.713 04:50:21 -- accel/accel.sh@101 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:46.713 04:50:21 -- common/autotest_common.sh@1087 -- # '[' 7 -le 1 ']' 00:06:46.713 04:50:21 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:46.713 04:50:21 -- common/autotest_common.sh@10 -- # set +x 00:06:46.713 ************************************ 00:06:46.713 START TEST accel_xor 00:06:46.713 ************************************ 00:06:46.713 04:50:21 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y 00:06:46.713 04:50:21 -- accel/accel.sh@16 -- # local accel_opc 00:06:46.713 04:50:21 -- accel/accel.sh@17 
-- # local accel_module 00:06:46.713 04:50:21 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y 00:06:46.713 04:50:21 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:46.713 04:50:21 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.713 04:50:21 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:46.713 04:50:21 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.713 04:50:21 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.713 04:50:21 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:46.713 04:50:21 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:46.713 04:50:21 -- accel/accel.sh@41 -- # local IFS=, 00:06:46.713 04:50:21 -- accel/accel.sh@42 -- # jq -r . 00:06:46.713 [2024-11-08 04:50:21.569034] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:46.713 [2024-11-08 04:50:21.569082] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3670901 ] 00:06:46.714 EAL: No free 2048 kB hugepages reported on node 1 00:06:46.714 [2024-11-08 04:50:21.631779] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.714 [2024-11-08 04:50:21.699194] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.089 04:50:22 -- accel/accel.sh@18 -- # out=' 00:06:48.089 SPDK Configuration: 00:06:48.089 Core mask: 0x1 00:06:48.089 00:06:48.089 Accel Perf Configuration: 00:06:48.089 Workload Type: xor 00:06:48.089 Source buffers: 2 00:06:48.089 Transfer size: 4096 bytes 00:06:48.089 Vector count 1 00:06:48.089 Module: software 00:06:48.089 Queue depth: 32 00:06:48.089 Allocate depth: 32 00:06:48.089 # threads/core: 1 00:06:48.089 Run time: 1 seconds 00:06:48.090 Verify: Yes 00:06:48.090 00:06:48.090 Running for 1 seconds... 00:06:48.090 00:06:48.090 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:48.090 ------------------------------------------------------------------------------------ 00:06:48.090 0,0 709696/s 2772 MiB/s 0 0 00:06:48.090 ==================================================================================== 00:06:48.090 Total 709696/s 2772 MiB/s 0 0' 00:06:48.090 04:50:22 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:22 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:22 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:48.090 04:50:22 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:48.090 04:50:22 -- accel/accel.sh@12 -- # build_accel_config 00:06:48.090 04:50:22 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:48.090 04:50:22 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:48.090 04:50:22 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:48.090 04:50:22 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:48.090 04:50:22 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:48.090 04:50:22 -- accel/accel.sh@41 -- # local IFS=, 00:06:48.090 04:50:22 -- accel/accel.sh@42 -- # jq -r . 00:06:48.090 [2024-11-08 04:50:22.886889] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
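The xor workload folds N equal-length source buffers into a destination, byte by byte; the run above uses two sources ("Source buffers: 2"), and the accel_xor test repeated below with -x 3 uses three. This is the parity computation behind RAID-5-style redundancy, which is one reason accel engines offload it. A scalar sketch (illustrative; real implementations vectorize):

    /* Illustrative sketch only -- not SPDK code. */
    #include <stddef.h>
    #include <stdint.h>

    /* xor: dst[i] = srcs[0][i] ^ srcs[1][i] ^ ... ^ srcs[nsrc-1][i] */
    static void xor_buffers(uint8_t *dst, const uint8_t *const srcs[],
                            size_t nsrc, size_t len)
    {
        for (size_t i = 0; i < len; i++) {
            uint8_t v = srcs[0][i];
            for (size_t s = 1; s < nsrc; s++)
                v ^= srcs[s][i];
            dst[i] = v;
        }
    }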
00:06:48.090 [2024-11-08 04:50:22.886977] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3671168 ] 00:06:48.090 EAL: No free 2048 kB hugepages reported on node 1 00:06:48.090 [2024-11-08 04:50:22.955535] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:48.090 [2024-11-08 04:50:23.021647] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val= 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val= 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val=0x1 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val= 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val= 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val=xor 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val=2 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val= 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val=software 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@23 -- # accel_module=software 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val=32 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val=32 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- 
accel/accel.sh@21 -- # val=1 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val=Yes 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val= 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:48.090 04:50:23 -- accel/accel.sh@21 -- # val= 00:06:48.090 04:50:23 -- accel/accel.sh@22 -- # case "$var" in 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # IFS=: 00:06:48.090 04:50:23 -- accel/accel.sh@20 -- # read -r var val 00:06:49.466 04:50:24 -- accel/accel.sh@21 -- # val= 00:06:49.466 04:50:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # IFS=: 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # read -r var val 00:06:49.466 04:50:24 -- accel/accel.sh@21 -- # val= 00:06:49.466 04:50:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # IFS=: 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # read -r var val 00:06:49.466 04:50:24 -- accel/accel.sh@21 -- # val= 00:06:49.466 04:50:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # IFS=: 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # read -r var val 00:06:49.466 04:50:24 -- accel/accel.sh@21 -- # val= 00:06:49.466 04:50:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # IFS=: 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # read -r var val 00:06:49.466 04:50:24 -- accel/accel.sh@21 -- # val= 00:06:49.466 04:50:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # IFS=: 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # read -r var val 00:06:49.466 04:50:24 -- accel/accel.sh@21 -- # val= 00:06:49.466 04:50:24 -- accel/accel.sh@22 -- # case "$var" in 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # IFS=: 00:06:49.466 04:50:24 -- accel/accel.sh@20 -- # read -r var val 00:06:49.466 04:50:24 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:49.466 04:50:24 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:49.466 04:50:24 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:49.466 00:06:49.466 real 0m2.635s 00:06:49.466 user 0m2.397s 00:06:49.466 sys 0m0.245s 00:06:49.466 04:50:24 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:49.466 04:50:24 -- common/autotest_common.sh@10 -- # set +x 00:06:49.466 ************************************ 00:06:49.466 END TEST accel_xor 00:06:49.466 ************************************ 00:06:49.466 04:50:24 -- accel/accel.sh@102 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:49.466 04:50:24 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:06:49.466 04:50:24 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:49.466 04:50:24 -- common/autotest_common.sh@10 -- # set +x 00:06:49.466 ************************************ 00:06:49.466 START TEST accel_xor 
00:06:49.466 ************************************ 00:06:49.466 04:50:24 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w xor -y -x 3 00:06:49.466 04:50:24 -- accel/accel.sh@16 -- # local accel_opc 00:06:49.466 04:50:24 -- accel/accel.sh@17 -- # local accel_module 00:06:49.466 04:50:24 -- accel/accel.sh@18 -- # accel_perf -t 1 -w xor -y -x 3 00:06:49.466 04:50:24 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:49.466 04:50:24 -- accel/accel.sh@12 -- # build_accel_config 00:06:49.466 04:50:24 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:49.466 04:50:24 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.466 04:50:24 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.466 04:50:24 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:49.466 04:50:24 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:49.466 04:50:24 -- accel/accel.sh@41 -- # local IFS=, 00:06:49.466 04:50:24 -- accel/accel.sh@42 -- # jq -r . 00:06:49.466 [2024-11-08 04:50:24.263403] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:49.466 [2024-11-08 04:50:24.263497] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3671452 ] 00:06:49.466 EAL: No free 2048 kB hugepages reported on node 1 00:06:49.466 [2024-11-08 04:50:24.330359] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.466 [2024-11-08 04:50:24.397505] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.841 04:50:25 -- accel/accel.sh@18 -- # out=' 00:06:50.841 SPDK Configuration: 00:06:50.841 Core mask: 0x1 00:06:50.841 00:06:50.841 Accel Perf Configuration: 00:06:50.841 Workload Type: xor 00:06:50.841 Source buffers: 3 00:06:50.841 Transfer size: 4096 bytes 00:06:50.841 Vector count 1 00:06:50.841 Module: software 00:06:50.841 Queue depth: 32 00:06:50.841 Allocate depth: 32 00:06:50.841 # threads/core: 1 00:06:50.841 Run time: 1 seconds 00:06:50.841 Verify: Yes 00:06:50.841 00:06:50.841 Running for 1 seconds... 00:06:50.841 00:06:50.841 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:50.842 ------------------------------------------------------------------------------------ 00:06:50.842 0,0 672480/s 2626 MiB/s 0 0 00:06:50.842 ==================================================================================== 00:06:50.842 Total 672480/s 2626 MiB/s 0 0' 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:50.842 04:50:25 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:06:50.842 04:50:25 -- accel/accel.sh@12 -- # build_accel_config 00:06:50.842 04:50:25 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:50.842 04:50:25 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.842 04:50:25 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.842 04:50:25 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:50.842 04:50:25 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:50.842 04:50:25 -- accel/accel.sh@41 -- # local IFS=, 00:06:50.842 04:50:25 -- accel/accel.sh@42 -- # jq -r . 00:06:50.842 [2024-11-08 04:50:25.587955] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
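The Bandwidth column in these tables follows directly from the Transfers column: transfers per second times transfer size, expressed in MiB/s. Checking the 3-source xor result above:

    672480 transfers/s * 4096 bytes/transfer = 2,754,478,080 bytes/s
    2,754,478,080 bytes/s / 1,048,576 bytes/MiB ~= 2626 MiB/s

which matches the reported figure. The same arithmetic explains why copy_crc32c_C2 earlier showed higher bandwidth (2326 MiB/s) at a lower transfer rate: each of its transfers moves 8192 bytes instead of 4096.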
00:06:50.842 [2024-11-08 04:50:25.588049] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3671724 ] 00:06:50.842 EAL: No free 2048 kB hugepages reported on node 1 00:06:50.842 [2024-11-08 04:50:25.656592] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.842 [2024-11-08 04:50:25.719903] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val= 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val= 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val=0x1 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val= 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val= 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val=xor 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@24 -- # accel_opc=xor 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val=3 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val= 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val=software 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@23 -- # accel_module=software 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val=32 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val=32 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- 
accel/accel.sh@21 -- # val=1 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val=Yes 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val= 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:50.842 04:50:25 -- accel/accel.sh@21 -- # val= 00:06:50.842 04:50:25 -- accel/accel.sh@22 -- # case "$var" in 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # IFS=: 00:06:50.842 04:50:25 -- accel/accel.sh@20 -- # read -r var val 00:06:52.218 04:50:26 -- accel/accel.sh@21 -- # val= 00:06:52.218 04:50:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # IFS=: 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # read -r var val 00:06:52.218 04:50:26 -- accel/accel.sh@21 -- # val= 00:06:52.218 04:50:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # IFS=: 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # read -r var val 00:06:52.218 04:50:26 -- accel/accel.sh@21 -- # val= 00:06:52.218 04:50:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # IFS=: 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # read -r var val 00:06:52.218 04:50:26 -- accel/accel.sh@21 -- # val= 00:06:52.218 04:50:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # IFS=: 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # read -r var val 00:06:52.218 04:50:26 -- accel/accel.sh@21 -- # val= 00:06:52.218 04:50:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # IFS=: 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # read -r var val 00:06:52.218 04:50:26 -- accel/accel.sh@21 -- # val= 00:06:52.218 04:50:26 -- accel/accel.sh@22 -- # case "$var" in 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # IFS=: 00:06:52.218 04:50:26 -- accel/accel.sh@20 -- # read -r var val 00:06:52.218 04:50:26 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:52.218 04:50:26 -- accel/accel.sh@28 -- # [[ -n xor ]] 00:06:52.218 04:50:26 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:52.218 00:06:52.218 real 0m2.652s 00:06:52.218 user 0m2.423s 00:06:52.218 sys 0m0.237s 00:06:52.218 04:50:26 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:52.218 04:50:26 -- common/autotest_common.sh@10 -- # set +x 00:06:52.218 ************************************ 00:06:52.218 END TEST accel_xor 00:06:52.218 ************************************ 00:06:52.218 04:50:26 -- accel/accel.sh@103 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:52.218 04:50:26 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:52.218 04:50:26 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:52.218 04:50:26 -- common/autotest_common.sh@10 -- # set +x 00:06:52.218 ************************************ 00:06:52.218 START TEST 
accel_dif_verify 00:06:52.218 ************************************ 00:06:52.218 04:50:26 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_verify 00:06:52.218 04:50:26 -- accel/accel.sh@16 -- # local accel_opc 00:06:52.218 04:50:26 -- accel/accel.sh@17 -- # local accel_module 00:06:52.218 04:50:26 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_verify 00:06:52.218 04:50:26 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:52.218 04:50:26 -- accel/accel.sh@12 -- # build_accel_config 00:06:52.218 04:50:26 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:52.218 04:50:26 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.218 04:50:26 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.218 04:50:26 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:52.218 04:50:26 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:52.218 04:50:26 -- accel/accel.sh@41 -- # local IFS=, 00:06:52.218 04:50:26 -- accel/accel.sh@42 -- # jq -r . 00:06:52.218 [2024-11-08 04:50:26.966016] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:52.218 [2024-11-08 04:50:26.966090] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3671942 ] 00:06:52.218 EAL: No free 2048 kB hugepages reported on node 1 00:06:52.218 [2024-11-08 04:50:27.033952] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.218 [2024-11-08 04:50:27.101964] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.594 04:50:28 -- accel/accel.sh@18 -- # out=' 00:06:53.594 SPDK Configuration: 00:06:53.594 Core mask: 0x1 00:06:53.594 00:06:53.594 Accel Perf Configuration: 00:06:53.594 Workload Type: dif_verify 00:06:53.594 Vector size: 4096 bytes 00:06:53.594 Transfer size: 4096 bytes 00:06:53.594 Block size: 512 bytes 00:06:53.594 Metadata size: 8 bytes 00:06:53.594 Vector count 1 00:06:53.594 Module: software 00:06:53.594 Queue depth: 32 00:06:53.594 Allocate depth: 32 00:06:53.594 # threads/core: 1 00:06:53.594 Run time: 1 seconds 00:06:53.594 Verify: No 00:06:53.594 00:06:53.594 Running for 1 seconds... 00:06:53.594 00:06:53.594 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:53.594 ------------------------------------------------------------------------------------ 00:06:53.594 0,0 248800/s 987 MiB/s 0 0 00:06:53.594 ==================================================================================== 00:06:53.594 Total 248800/s 971 MiB/s 0 0' 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.594 04:50:28 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:53.594 04:50:28 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:53.594 04:50:28 -- accel/accel.sh@12 -- # build_accel_config 00:06:53.594 04:50:28 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:53.594 04:50:28 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.594 04:50:28 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.594 04:50:28 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:53.594 04:50:28 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:53.594 04:50:28 -- accel/accel.sh@41 -- # local IFS=, 00:06:53.594 04:50:28 -- accel/accel.sh@42 -- # jq -r . 
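Note on the figures: the Total row of the dif_verify table above is consistent with the configured 4096-byte transfer size, since bandwidth is just transfers per second times bytes per transfer. A one-line shell check (a sketch for the reader, not part of the harness) reproduces the number:

  echo $(( 248800 * 4096 / 1048576 ))   # 4096-byte transfers/s converted to MiB/s; prints 971

The same arithmetic links the Transfers and Bandwidth columns in each of the workload tables that follow.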
00:06:53.594 [2024-11-08 04:50:28.293988] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:53.594 [2024-11-08 04:50:28.294077] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3672102 ] 00:06:53.594 EAL: No free 2048 kB hugepages reported on node 1 00:06:53.594 [2024-11-08 04:50:28.364606] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.594 [2024-11-08 04:50:28.431139] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.594 04:50:28 -- accel/accel.sh@21 -- # val= 00:06:53.594 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.594 04:50:28 -- accel/accel.sh@21 -- # val= 00:06:53.594 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.594 04:50:28 -- accel/accel.sh@21 -- # val=0x1 00:06:53.594 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.594 04:50:28 -- accel/accel.sh@21 -- # val= 00:06:53.594 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.594 04:50:28 -- accel/accel.sh@21 -- # val= 00:06:53.594 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.594 04:50:28 -- accel/accel.sh@21 -- # val=dif_verify 00:06:53.594 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.594 04:50:28 -- accel/accel.sh@24 -- # accel_opc=dif_verify 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.594 04:50:28 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.594 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.594 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val= 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val=software 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@23 -- # 
accel_module=software 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val=32 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val=32 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val=1 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val=No 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val= 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:53.595 04:50:28 -- accel/accel.sh@21 -- # val= 00:06:53.595 04:50:28 -- accel/accel.sh@22 -- # case "$var" in 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # IFS=: 00:06:53.595 04:50:28 -- accel/accel.sh@20 -- # read -r var val 00:06:54.530 04:50:29 -- accel/accel.sh@21 -- # val= 00:06:54.530 04:50:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.530 04:50:29 -- accel/accel.sh@20 -- # IFS=: 00:06:54.530 04:50:29 -- accel/accel.sh@20 -- # read -r var val 00:06:54.530 04:50:29 -- accel/accel.sh@21 -- # val= 00:06:54.530 04:50:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.530 04:50:29 -- accel/accel.sh@20 -- # IFS=: 00:06:54.530 04:50:29 -- accel/accel.sh@20 -- # read -r var val 00:06:54.530 04:50:29 -- accel/accel.sh@21 -- # val= 00:06:54.530 04:50:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.530 04:50:29 -- accel/accel.sh@20 -- # IFS=: 00:06:54.530 04:50:29 -- accel/accel.sh@20 -- # read -r var val 00:06:54.530 04:50:29 -- accel/accel.sh@21 -- # val= 00:06:54.530 04:50:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.530 04:50:29 -- accel/accel.sh@20 -- # IFS=: 00:06:54.530 04:50:29 -- accel/accel.sh@20 -- # read -r var val 00:06:54.531 04:50:29 -- accel/accel.sh@21 -- # val= 00:06:54.531 04:50:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.531 04:50:29 -- accel/accel.sh@20 -- # IFS=: 00:06:54.531 04:50:29 -- accel/accel.sh@20 -- # read -r var val 00:06:54.531 04:50:29 -- accel/accel.sh@21 -- # val= 00:06:54.531 04:50:29 -- accel/accel.sh@22 -- # case "$var" in 00:06:54.531 04:50:29 -- accel/accel.sh@20 -- # IFS=: 00:06:54.531 04:50:29 -- accel/accel.sh@20 -- # read -r var val 00:06:54.531 04:50:29 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:54.531 04:50:29 -- accel/accel.sh@28 -- # [[ -n dif_verify ]] 00:06:54.531 04:50:29 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:54.531 00:06:54.531 real 0m2.660s 00:06:54.531 user 0m2.409s 00:06:54.531 sys 0m0.261s 00:06:54.531 04:50:29 -- 
common/autotest_common.sh@1115 -- # xtrace_disable 00:06:54.531 04:50:29 -- common/autotest_common.sh@10 -- # set +x 00:06:54.531 ************************************ 00:06:54.531 END TEST accel_dif_verify 00:06:54.531 ************************************ 00:06:54.788 04:50:29 -- accel/accel.sh@104 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:54.788 04:50:29 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:54.788 04:50:29 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:54.788 04:50:29 -- common/autotest_common.sh@10 -- # set +x 00:06:54.788 ************************************ 00:06:54.788 START TEST accel_dif_generate 00:06:54.788 ************************************ 00:06:54.788 04:50:29 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate 00:06:54.788 04:50:29 -- accel/accel.sh@16 -- # local accel_opc 00:06:54.788 04:50:29 -- accel/accel.sh@17 -- # local accel_module 00:06:54.788 04:50:29 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate 00:06:54.788 04:50:29 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:54.788 04:50:29 -- accel/accel.sh@12 -- # build_accel_config 00:06:54.788 04:50:29 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:54.788 04:50:29 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:54.788 04:50:29 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:54.788 04:50:29 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:54.788 04:50:29 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:54.788 04:50:29 -- accel/accel.sh@41 -- # local IFS=, 00:06:54.788 04:50:29 -- accel/accel.sh@42 -- # jq -r . 00:06:54.788 [2024-11-08 04:50:29.673296] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:54.788 [2024-11-08 04:50:29.673388] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3672326 ] 00:06:54.788 EAL: No free 2048 kB hugepages reported on node 1 00:06:54.788 [2024-11-08 04:50:29.742704] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.788 [2024-11-08 04:50:29.810717] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.163 04:50:30 -- accel/accel.sh@18 -- # out=' 00:06:56.163 SPDK Configuration: 00:06:56.163 Core mask: 0x1 00:06:56.163 00:06:56.163 Accel Perf Configuration: 00:06:56.163 Workload Type: dif_generate 00:06:56.163 Vector size: 4096 bytes 00:06:56.163 Transfer size: 4096 bytes 00:06:56.163 Block size: 512 bytes 00:06:56.163 Metadata size: 8 bytes 00:06:56.163 Vector count 1 00:06:56.163 Module: software 00:06:56.163 Queue depth: 32 00:06:56.163 Allocate depth: 32 00:06:56.163 # threads/core: 1 00:06:56.163 Run time: 1 seconds 00:06:56.163 Verify: No 00:06:56.163 00:06:56.163 Running for 1 seconds... 
00:06:56.163 00:06:56.163 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:56.163 ------------------------------------------------------------------------------------ 00:06:56.163 0,0 277312/s 1100 MiB/s 0 0 00:06:56.163 ==================================================================================== 00:06:56.163 Total 277312/s 1083 MiB/s 0 0' 00:06:56.163 04:50:30 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:30 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:30 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:56.163 04:50:30 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:56.163 04:50:30 -- accel/accel.sh@12 -- # build_accel_config 00:06:56.163 04:50:30 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:56.163 04:50:30 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.163 04:50:30 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.163 04:50:30 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:56.163 04:50:30 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:56.163 04:50:30 -- accel/accel.sh@41 -- # local IFS=, 00:06:56.163 04:50:30 -- accel/accel.sh@42 -- # jq -r . 00:06:56.163 [2024-11-08 04:50:30.998942] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:06:56.163 [2024-11-08 04:50:30.999038] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3672586 ] 00:06:56.163 EAL: No free 2048 kB hugepages reported on node 1 00:06:56.163 [2024-11-08 04:50:31.065793] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.163 [2024-11-08 04:50:31.131641] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val= 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val= 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val=0x1 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val= 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val= 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val=dif_generate 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@24 -- # accel_opc=dif_generate 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 
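Note on the trace: the long runs of "val=... / case \"$var\" in / IFS=: / read -r var val" entries here are accel.sh parsing accel_perf's own configuration echo. Each output line is split on ':' into a label and a value, and fields such as the workload and module are captured for the checks at the end of each test. A minimal sketch of that loop (an approximation reconstructed from the trace, not the verbatim accel.sh source):

  while IFS=: read -r var val; do
      case "$var" in
          *'Workload Type'*) accel_opc=${val##* } ;;    # e.g. dif_generate
          *'Module'*)        accel_module=${val##* } ;; # e.g. software
      esac
  done < <(./build/examples/accel_perf -t 1 -w dif_generate)

The captured accel_module and accel_opc values are what the "[[ -n software ]]"-style assertions at the end of each test section verify.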
00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val='512 bytes' 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val='8 bytes' 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val= 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val=software 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@23 -- # accel_module=software 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val=32 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val=32 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val=1 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val=No 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val= 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:56.163 04:50:31 -- accel/accel.sh@21 -- # val= 00:06:56.163 04:50:31 -- accel/accel.sh@22 -- # case "$var" in 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # IFS=: 00:06:56.163 04:50:31 -- accel/accel.sh@20 -- # read -r var val 00:06:57.539 04:50:32 -- accel/accel.sh@21 -- # val= 00:06:57.539 04:50:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # IFS=: 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # read -r var val 00:06:57.539 04:50:32 -- accel/accel.sh@21 -- # val= 00:06:57.539 04:50:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # IFS=: 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # read -r var val 00:06:57.539 04:50:32 -- accel/accel.sh@21 -- # val= 00:06:57.539 04:50:32 -- 
accel/accel.sh@22 -- # case "$var" in 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # IFS=: 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # read -r var val 00:06:57.539 04:50:32 -- accel/accel.sh@21 -- # val= 00:06:57.539 04:50:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # IFS=: 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # read -r var val 00:06:57.539 04:50:32 -- accel/accel.sh@21 -- # val= 00:06:57.539 04:50:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # IFS=: 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # read -r var val 00:06:57.539 04:50:32 -- accel/accel.sh@21 -- # val= 00:06:57.539 04:50:32 -- accel/accel.sh@22 -- # case "$var" in 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # IFS=: 00:06:57.539 04:50:32 -- accel/accel.sh@20 -- # read -r var val 00:06:57.539 04:50:32 -- accel/accel.sh@28 -- # [[ -n software ]] 00:06:57.539 04:50:32 -- accel/accel.sh@28 -- # [[ -n dif_generate ]] 00:06:57.539 04:50:32 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:57.539 00:06:57.539 real 0m2.654s 00:06:57.539 user 0m2.419s 00:06:57.539 sys 0m0.242s 00:06:57.539 04:50:32 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:06:57.539 04:50:32 -- common/autotest_common.sh@10 -- # set +x 00:06:57.539 ************************************ 00:06:57.539 END TEST accel_dif_generate 00:06:57.539 ************************************ 00:06:57.539 04:50:32 -- accel/accel.sh@105 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:06:57.539 04:50:32 -- common/autotest_common.sh@1087 -- # '[' 6 -le 1 ']' 00:06:57.539 04:50:32 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:06:57.539 04:50:32 -- common/autotest_common.sh@10 -- # set +x 00:06:57.539 ************************************ 00:06:57.539 START TEST accel_dif_generate_copy 00:06:57.539 ************************************ 00:06:57.539 04:50:32 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w dif_generate_copy 00:06:57.539 04:50:32 -- accel/accel.sh@16 -- # local accel_opc 00:06:57.539 04:50:32 -- accel/accel.sh@17 -- # local accel_module 00:06:57.539 04:50:32 -- accel/accel.sh@18 -- # accel_perf -t 1 -w dif_generate_copy 00:06:57.539 04:50:32 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:57.539 04:50:32 -- accel/accel.sh@12 -- # build_accel_config 00:06:57.539 04:50:32 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:57.540 04:50:32 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:57.540 04:50:32 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:57.540 04:50:32 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:57.540 04:50:32 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:57.540 04:50:32 -- accel/accel.sh@41 -- # local IFS=, 00:06:57.540 04:50:32 -- accel/accel.sh@42 -- # jq -r . 00:06:57.540 [2024-11-08 04:50:32.375560] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:57.540 [2024-11-08 04:50:32.375654] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3672867 ] 00:06:57.540 EAL: No free 2048 kB hugepages reported on node 1 00:06:57.540 [2024-11-08 04:50:32.444033] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.540 [2024-11-08 04:50:32.511546] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.915 04:50:33 -- accel/accel.sh@18 -- # out=' 00:06:58.915 SPDK Configuration: 00:06:58.915 Core mask: 0x1 00:06:58.915 00:06:58.915 Accel Perf Configuration: 00:06:58.915 Workload Type: dif_generate_copy 00:06:58.915 Vector size: 4096 bytes 00:06:58.915 Transfer size: 4096 bytes 00:06:58.915 Vector count 1 00:06:58.915 Module: software 00:06:58.915 Queue depth: 32 00:06:58.915 Allocate depth: 32 00:06:58.915 # threads/core: 1 00:06:58.915 Run time: 1 seconds 00:06:58.915 Verify: No 00:06:58.915 00:06:58.915 Running for 1 seconds... 00:06:58.915 00:06:58.915 Core,Thread Transfers Bandwidth Failed Miscompares 00:06:58.915 ------------------------------------------------------------------------------------ 00:06:58.915 0,0 222400/s 882 MiB/s 0 0 00:06:58.915 ==================================================================================== 00:06:58.915 Total 222400/s 868 MiB/s 0 0' 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.915 04:50:33 -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:58.915 04:50:33 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:58.915 04:50:33 -- accel/accel.sh@12 -- # build_accel_config 00:06:58.915 04:50:33 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:06:58.915 04:50:33 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.915 04:50:33 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.915 04:50:33 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:06:58.915 04:50:33 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:06:58.915 04:50:33 -- accel/accel.sh@41 -- # local IFS=, 00:06:58.915 04:50:33 -- accel/accel.sh@42 -- # jq -r . 00:06:58.915 [2024-11-08 04:50:33.700447] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:06:58.915 [2024-11-08 04:50:33.700548] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3673141 ] 00:06:58.915 EAL: No free 2048 kB hugepages reported on node 1 00:06:58.915 [2024-11-08 04:50:33.767308] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.915 [2024-11-08 04:50:33.832857] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.915 04:50:33 -- accel/accel.sh@21 -- # val= 00:06:58.915 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.915 04:50:33 -- accel/accel.sh@21 -- # val= 00:06:58.915 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.915 04:50:33 -- accel/accel.sh@21 -- # val=0x1 00:06:58.915 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.915 04:50:33 -- accel/accel.sh@21 -- # val= 00:06:58.915 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.915 04:50:33 -- accel/accel.sh@21 -- # val= 00:06:58.915 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.915 04:50:33 -- accel/accel.sh@21 -- # val=dif_generate_copy 00:06:58.915 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.915 04:50:33 -- accel/accel.sh@24 -- # accel_opc=dif_generate_copy 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.915 04:50:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:58.915 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.915 04:50:33 -- accel/accel.sh@21 -- # val='4096 bytes' 00:06:58.915 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.915 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val= 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val=software 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@23 -- # accel_module=software 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val=32 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val=32 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r 
var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val=1 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val='1 seconds' 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val=No 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val= 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:06:58.916 04:50:33 -- accel/accel.sh@21 -- # val= 00:06:58.916 04:50:33 -- accel/accel.sh@22 -- # case "$var" in 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # IFS=: 00:06:58.916 04:50:33 -- accel/accel.sh@20 -- # read -r var val 00:07:00.291 04:50:34 -- accel/accel.sh@21 -- # val= 00:07:00.291 04:50:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.291 04:50:34 -- accel/accel.sh@20 -- # IFS=: 00:07:00.291 04:50:34 -- accel/accel.sh@20 -- # read -r var val 00:07:00.291 04:50:34 -- accel/accel.sh@21 -- # val= 00:07:00.291 04:50:34 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.291 04:50:34 -- accel/accel.sh@20 -- # IFS=: 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # read -r var val 00:07:00.291 04:50:35 -- accel/accel.sh@21 -- # val= 00:07:00.291 04:50:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # IFS=: 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # read -r var val 00:07:00.291 04:50:35 -- accel/accel.sh@21 -- # val= 00:07:00.291 04:50:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # IFS=: 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # read -r var val 00:07:00.291 04:50:35 -- accel/accel.sh@21 -- # val= 00:07:00.291 04:50:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # IFS=: 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # read -r var val 00:07:00.291 04:50:35 -- accel/accel.sh@21 -- # val= 00:07:00.291 04:50:35 -- accel/accel.sh@22 -- # case "$var" in 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # IFS=: 00:07:00.291 04:50:35 -- accel/accel.sh@20 -- # read -r var val 00:07:00.291 04:50:35 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:00.291 04:50:35 -- accel/accel.sh@28 -- # [[ -n dif_generate_copy ]] 00:07:00.291 04:50:35 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:00.291 00:07:00.291 real 0m2.653s 00:07:00.291 user 0m2.406s 00:07:00.291 sys 0m0.255s 00:07:00.292 04:50:35 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:00.292 04:50:35 -- common/autotest_common.sh@10 -- # set +x 00:07:00.292 ************************************ 00:07:00.292 END TEST accel_dif_generate_copy 00:07:00.292 ************************************ 00:07:00.292 04:50:35 -- accel/accel.sh@107 -- # [[ y == y ]] 00:07:00.292 04:50:35 -- accel/accel.sh@108 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:00.292 04:50:35 -- common/autotest_common.sh@1087 -- # '[' 8 -le 1 ']' 00:07:00.292 04:50:35 -- 
common/autotest_common.sh@1093 -- # xtrace_disable 00:07:00.292 04:50:35 -- common/autotest_common.sh@10 -- # set +x 00:07:00.292 ************************************ 00:07:00.292 START TEST accel_comp 00:07:00.292 ************************************ 00:07:00.292 04:50:35 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:00.292 04:50:35 -- accel/accel.sh@16 -- # local accel_opc 00:07:00.292 04:50:35 -- accel/accel.sh@17 -- # local accel_module 00:07:00.292 04:50:35 -- accel/accel.sh@18 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:00.292 04:50:35 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:00.292 04:50:35 -- accel/accel.sh@12 -- # build_accel_config 00:07:00.292 04:50:35 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:00.292 04:50:35 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:00.292 04:50:35 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:00.292 04:50:35 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:00.292 04:50:35 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:00.292 04:50:35 -- accel/accel.sh@41 -- # local IFS=, 00:07:00.292 04:50:35 -- accel/accel.sh@42 -- # jq -r . 00:07:00.292 [2024-11-08 04:50:35.079215] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:00.292 [2024-11-08 04:50:35.079309] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3673422 ] 00:07:00.292 EAL: No free 2048 kB hugepages reported on node 1 00:07:00.292 [2024-11-08 04:50:35.149035] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:00.292 [2024-11-08 04:50:35.215357] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.668 04:50:36 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:01.668 00:07:01.668 SPDK Configuration: 00:07:01.668 Core mask: 0x1 00:07:01.668 00:07:01.668 Accel Perf Configuration: 00:07:01.668 Workload Type: compress 00:07:01.668 Transfer size: 4096 bytes 00:07:01.668 Vector count 1 00:07:01.668 Module: software 00:07:01.668 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:01.668 Queue depth: 32 00:07:01.668 Allocate depth: 32 00:07:01.668 # threads/core: 1 00:07:01.668 Run time: 1 seconds 00:07:01.668 Verify: No 00:07:01.668 00:07:01.668 Running for 1 seconds... 
00:07:01.668 00:07:01.668 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:01.668 ------------------------------------------------------------------------------------ 00:07:01.668 0,0 68032/s 283 MiB/s 0 0 00:07:01.668 ==================================================================================== 00:07:01.668 Total 68032/s 265 MiB/s 0 0' 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:01.668 04:50:36 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:01.668 04:50:36 -- accel/accel.sh@12 -- # build_accel_config 00:07:01.668 04:50:36 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:01.668 04:50:36 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.668 04:50:36 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.668 04:50:36 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:01.668 04:50:36 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:01.668 04:50:36 -- accel/accel.sh@41 -- # local IFS=, 00:07:01.668 04:50:36 -- accel/accel.sh@42 -- # jq -r . 00:07:01.668 [2024-11-08 04:50:36.408805] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:01.668 [2024-11-08 04:50:36.408893] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3673694 ] 00:07:01.668 EAL: No free 2048 kB hugepages reported on node 1 00:07:01.668 [2024-11-08 04:50:36.477625] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.668 [2024-11-08 04:50:36.543281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val= 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val= 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val= 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val=0x1 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val= 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val= 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val=compress 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 
04:50:36 -- accel/accel.sh@24 -- # accel_opc=compress 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val= 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val=software 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@23 -- # accel_module=software 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val=32 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val=32 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val=1 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.668 04:50:36 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:01.668 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.668 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.669 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.669 04:50:36 -- accel/accel.sh@21 -- # val=No 00:07:01.669 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.669 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.669 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.669 04:50:36 -- accel/accel.sh@21 -- # val= 00:07:01.669 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.669 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.669 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:01.669 04:50:36 -- accel/accel.sh@21 -- # val= 00:07:01.669 04:50:36 -- accel/accel.sh@22 -- # case "$var" in 00:07:01.669 04:50:36 -- accel/accel.sh@20 -- # IFS=: 00:07:01.669 04:50:36 -- accel/accel.sh@20 -- # read -r var val 00:07:02.604 04:50:37 -- accel/accel.sh@21 -- # val= 00:07:02.909 04:50:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # IFS=: 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # read -r var val 00:07:02.909 04:50:37 -- accel/accel.sh@21 -- # val= 00:07:02.909 04:50:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # IFS=: 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # read -r var val 00:07:02.909 04:50:37 -- accel/accel.sh@21 -- # val= 00:07:02.909 04:50:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # 
IFS=: 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # read -r var val 00:07:02.909 04:50:37 -- accel/accel.sh@21 -- # val= 00:07:02.909 04:50:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # IFS=: 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # read -r var val 00:07:02.909 04:50:37 -- accel/accel.sh@21 -- # val= 00:07:02.909 04:50:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # IFS=: 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # read -r var val 00:07:02.909 04:50:37 -- accel/accel.sh@21 -- # val= 00:07:02.909 04:50:37 -- accel/accel.sh@22 -- # case "$var" in 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # IFS=: 00:07:02.909 04:50:37 -- accel/accel.sh@20 -- # read -r var val 00:07:02.909 04:50:37 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:02.909 04:50:37 -- accel/accel.sh@28 -- # [[ -n compress ]] 00:07:02.909 04:50:37 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.909 00:07:02.909 real 0m2.665s 00:07:02.909 user 0m2.411s 00:07:02.909 sys 0m0.262s 00:07:02.909 04:50:37 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:02.909 04:50:37 -- common/autotest_common.sh@10 -- # set +x 00:07:02.909 ************************************ 00:07:02.909 END TEST accel_comp 00:07:02.909 ************************************ 00:07:02.909 04:50:37 -- accel/accel.sh@109 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:02.909 04:50:37 -- common/autotest_common.sh@1087 -- # '[' 9 -le 1 ']' 00:07:02.909 04:50:37 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:02.909 04:50:37 -- common/autotest_common.sh@10 -- # set +x 00:07:02.909 ************************************ 00:07:02.909 START TEST accel_decomp 00:07:02.909 ************************************ 00:07:02.909 04:50:37 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:02.909 04:50:37 -- accel/accel.sh@16 -- # local accel_opc 00:07:02.909 04:50:37 -- accel/accel.sh@17 -- # local accel_module 00:07:02.909 04:50:37 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:02.909 04:50:37 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:02.909 04:50:37 -- accel/accel.sh@12 -- # build_accel_config 00:07:02.909 04:50:37 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:02.909 04:50:37 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.909 04:50:37 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.909 04:50:37 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:02.909 04:50:37 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:02.909 04:50:37 -- accel/accel.sh@41 -- # local IFS=, 00:07:02.909 04:50:37 -- accel/accel.sh@42 -- # jq -r . 00:07:02.909 [2024-11-08 04:50:37.792582] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:02.909 [2024-11-08 04:50:37.792676] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3673933 ] 00:07:02.909 EAL: No free 2048 kB hugepages reported on node 1 00:07:02.909 [2024-11-08 04:50:37.862942] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:02.909 [2024-11-08 04:50:37.931170] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.295 04:50:39 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:04.295 00:07:04.295 SPDK Configuration: 00:07:04.295 Core mask: 0x1 00:07:04.295 00:07:04.295 Accel Perf Configuration: 00:07:04.295 Workload Type: decompress 00:07:04.295 Transfer size: 4096 bytes 00:07:04.295 Vector count 1 00:07:04.295 Module: software 00:07:04.295 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:04.295 Queue depth: 32 00:07:04.295 Allocate depth: 32 00:07:04.295 # threads/core: 1 00:07:04.295 Run time: 1 seconds 00:07:04.295 Verify: Yes 00:07:04.295 00:07:04.295 Running for 1 seconds... 00:07:04.295 00:07:04.295 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:04.295 ------------------------------------------------------------------------------------ 00:07:04.296 0,0 91712/s 169 MiB/s 0 0 00:07:04.296 ==================================================================================== 00:07:04.296 Total 91712/s 358 MiB/s 0 0' 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:04.296 04:50:39 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y 00:07:04.296 04:50:39 -- accel/accel.sh@12 -- # build_accel_config 00:07:04.296 04:50:39 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:04.296 04:50:39 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.296 04:50:39 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.296 04:50:39 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:04.296 04:50:39 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:04.296 04:50:39 -- accel/accel.sh@41 -- # local IFS=, 00:07:04.296 04:50:39 -- accel/accel.sh@42 -- # jq -r . 00:07:04.296 [2024-11-08 04:50:39.122378] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:04.296 [2024-11-08 04:50:39.122467] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3674103 ] 00:07:04.296 EAL: No free 2048 kB hugepages reported on node 1 00:07:04.296 [2024-11-08 04:50:39.190753] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:04.296 [2024-11-08 04:50:39.258111] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val= 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val= 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val= 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val=0x1 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val= 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val= 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val=decompress 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val= 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val=software 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@23 -- # accel_module=software 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val=32 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 
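Note on the compression tests: as the command lines above show, they drive the same accel_perf example binary against an input corpus; -w selects compress or decompress, -l points at the test file (test/accel/bib in this tree), and -y enables verification on the decompress side (the runs with -y report "Verify: Yes"). A standalone re-run outside the harness might look like this (a sketch; the harness additionally feeds a JSON accel config via -c /dev/fd/62, omitted here):

  cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
  ./build/examples/accel_perf -t 1 -w compress -l test/accel/bib
  ./build/examples/accel_perf -t 1 -w decompress -l test/accel/bib -y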
04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val=32 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val=1 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val=Yes 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val= 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:04.296 04:50:39 -- accel/accel.sh@21 -- # val= 00:07:04.296 04:50:39 -- accel/accel.sh@22 -- # case "$var" in 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # IFS=: 00:07:04.296 04:50:39 -- accel/accel.sh@20 -- # read -r var val 00:07:05.672 04:50:40 -- accel/accel.sh@21 -- # val= 00:07:05.672 04:50:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # IFS=: 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # read -r var val 00:07:05.672 04:50:40 -- accel/accel.sh@21 -- # val= 00:07:05.672 04:50:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # IFS=: 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # read -r var val 00:07:05.672 04:50:40 -- accel/accel.sh@21 -- # val= 00:07:05.672 04:50:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # IFS=: 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # read -r var val 00:07:05.672 04:50:40 -- accel/accel.sh@21 -- # val= 00:07:05.672 04:50:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # IFS=: 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # read -r var val 00:07:05.672 04:50:40 -- accel/accel.sh@21 -- # val= 00:07:05.672 04:50:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # IFS=: 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # read -r var val 00:07:05.672 04:50:40 -- accel/accel.sh@21 -- # val= 00:07:05.672 04:50:40 -- accel/accel.sh@22 -- # case "$var" in 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # IFS=: 00:07:05.672 04:50:40 -- accel/accel.sh@20 -- # read -r var val 00:07:05.672 04:50:40 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:05.673 04:50:40 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:05.673 04:50:40 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.673 00:07:05.673 real 0m2.668s 00:07:05.673 user 0m2.423s 00:07:05.673 sys 0m0.256s 00:07:05.673 04:50:40 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:05.673 04:50:40 -- common/autotest_common.sh@10 -- # set +x 00:07:05.673 ************************************ 00:07:05.673 END TEST accel_decomp 00:07:05.673 ************************************ 00:07:05.673 04:50:40 -- accel/accel.sh@110 -- # run_test accel_decmop_full accel_test -t 1 
-w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:05.673 04:50:40 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:05.673 04:50:40 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:05.673 04:50:40 -- common/autotest_common.sh@10 -- # set +x 00:07:05.673 ************************************ 00:07:05.673 START TEST accel_decmop_full 00:07:05.673 ************************************ 00:07:05.673 04:50:40 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:05.673 04:50:40 -- accel/accel.sh@16 -- # local accel_opc 00:07:05.673 04:50:40 -- accel/accel.sh@17 -- # local accel_module 00:07:05.673 04:50:40 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:05.673 04:50:40 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:05.673 04:50:40 -- accel/accel.sh@12 -- # build_accel_config 00:07:05.673 04:50:40 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:05.673 04:50:40 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.673 04:50:40 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.673 04:50:40 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:05.673 04:50:40 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:05.673 04:50:40 -- accel/accel.sh@41 -- # local IFS=, 00:07:05.673 04:50:40 -- accel/accel.sh@42 -- # jq -r . 00:07:05.673 [2024-11-08 04:50:40.506390] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:05.673 [2024-11-08 04:50:40.506478] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3674310 ] 00:07:05.673 EAL: No free 2048 kB hugepages reported on node 1 00:07:05.673 [2024-11-08 04:50:40.577551] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.673 [2024-11-08 04:50:40.647795] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.049 04:50:41 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:07.049 00:07:07.049 SPDK Configuration: 00:07:07.049 Core mask: 0x1 00:07:07.049 00:07:07.049 Accel Perf Configuration: 00:07:07.049 Workload Type: decompress 00:07:07.049 Transfer size: 111250 bytes 00:07:07.049 Vector count 1 00:07:07.049 Module: software 00:07:07.049 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.049 Queue depth: 32 00:07:07.049 Allocate depth: 32 00:07:07.049 # threads/core: 1 00:07:07.049 Run time: 1 seconds 00:07:07.049 Verify: Yes 00:07:07.049 00:07:07.049 Running for 1 seconds... 
00:07:07.049 00:07:07.049 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:07.049 ------------------------------------------------------------------------------------ 00:07:07.049 0,0 5888/s 243 MiB/s 0 0 00:07:07.049 ==================================================================================== 00:07:07.049 Total 5888/s 624 MiB/s 0 0' 00:07:07.049 04:50:41 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:41 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:41 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.049 04:50:41 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:07.049 04:50:41 -- accel/accel.sh@12 -- # build_accel_config 00:07:07.049 04:50:41 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:07.049 04:50:41 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.049 04:50:41 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.049 04:50:41 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:07.049 04:50:41 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:07.049 04:50:41 -- accel/accel.sh@41 -- # local IFS=, 00:07:07.049 04:50:41 -- accel/accel.sh@42 -- # jq -r . 00:07:07.049 [2024-11-08 04:50:41.848928] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:07.049 [2024-11-08 04:50:41.849007] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3674559 ] 00:07:07.049 EAL: No free 2048 kB hugepages reported on node 1 00:07:07.049 [2024-11-08 04:50:41.916904] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.049 [2024-11-08 04:50:41.983008] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val= 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val= 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val= 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val=0x1 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val= 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val= 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val=decompress 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val= 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val=software 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@23 -- # accel_module=software 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val=32 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val=32 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val=1 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.049 04:50:42 -- accel/accel.sh@21 -- # val=Yes 00:07:07.049 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.049 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.050 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.050 04:50:42 -- accel/accel.sh@21 -- # val= 00:07:07.050 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.050 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.050 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:07.050 04:50:42 -- accel/accel.sh@21 -- # val= 00:07:07.050 04:50:42 -- accel/accel.sh@22 -- # case "$var" in 00:07:07.050 04:50:42 -- accel/accel.sh@20 -- # IFS=: 00:07:07.050 04:50:42 -- accel/accel.sh@20 -- # read -r var val 00:07:08.425 04:50:43 -- accel/accel.sh@21 -- # val= 00:07:08.425 04:50:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # IFS=: 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # read -r var val 00:07:08.425 04:50:43 -- accel/accel.sh@21 -- # val= 00:07:08.425 04:50:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # IFS=: 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # read -r var val 00:07:08.425 04:50:43 -- accel/accel.sh@21 -- # val= 00:07:08.425 04:50:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.425 04:50:43 
-- accel/accel.sh@20 -- # IFS=: 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # read -r var val 00:07:08.425 04:50:43 -- accel/accel.sh@21 -- # val= 00:07:08.425 04:50:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # IFS=: 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # read -r var val 00:07:08.425 04:50:43 -- accel/accel.sh@21 -- # val= 00:07:08.425 04:50:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # IFS=: 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # read -r var val 00:07:08.425 04:50:43 -- accel/accel.sh@21 -- # val= 00:07:08.425 04:50:43 -- accel/accel.sh@22 -- # case "$var" in 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # IFS=: 00:07:08.425 04:50:43 -- accel/accel.sh@20 -- # read -r var val 00:07:08.425 04:50:43 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:08.425 04:50:43 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:08.425 04:50:43 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.425 00:07:08.425 real 0m2.680s 00:07:08.425 user 0m2.431s 00:07:08.425 sys 0m0.257s 00:07:08.425 04:50:43 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:08.425 04:50:43 -- common/autotest_common.sh@10 -- # set +x 00:07:08.425 ************************************ 00:07:08.425 END TEST accel_decmop_full 00:07:08.425 ************************************ 00:07:08.425 04:50:43 -- accel/accel.sh@111 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:08.425 04:50:43 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:08.425 04:50:43 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:08.425 04:50:43 -- common/autotest_common.sh@10 -- # set +x 00:07:08.425 ************************************ 00:07:08.425 START TEST accel_decomp_mcore 00:07:08.425 ************************************ 00:07:08.425 04:50:43 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:08.425 04:50:43 -- accel/accel.sh@16 -- # local accel_opc 00:07:08.425 04:50:43 -- accel/accel.sh@17 -- # local accel_module 00:07:08.425 04:50:43 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:08.425 04:50:43 -- accel/accel.sh@12 -- # build_accel_config 00:07:08.425 04:50:43 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:08.425 04:50:43 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:08.425 04:50:43 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.425 04:50:43 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.425 04:50:43 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:08.425 04:50:43 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:08.425 04:50:43 -- accel/accel.sh@41 -- # local IFS=, 00:07:08.425 04:50:43 -- accel/accel.sh@42 -- # jq -r . 00:07:08.425 [2024-11-08 04:50:43.230823] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
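The xtrace above spells out the full accel_perf command line each of these decompress tests drives. A minimal standalone sketch of that invocation, assuming the same workspace path as this job and an empty JSON stub standing in for whatever build_accel_config assembles on fd 62:

#!/usr/bin/env bash
# Sketch only; the paths are the ones visible in this log. The -o 0 variant is
# what the "full" tests pass: in the runs above it correlates with the
# 111250-byte transfer size, while the runs without it report 4096 bytes.
SPDK=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
"$SPDK/build/examples/accel_perf" -c /dev/fd/62 -t 1 -w decompress \
    -l "$SPDK/test/accel/bib" -y -o 0 62<<< '{}'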
00:07:08.425 [2024-11-08 04:50:43.230911] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3674848 ] 00:07:08.425 EAL: No free 2048 kB hugepages reported on node 1 00:07:08.425 [2024-11-08 04:50:43.298999] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:08.425 [2024-11-08 04:50:43.369325] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:08.425 [2024-11-08 04:50:43.369422] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:08.425 [2024-11-08 04:50:43.369494] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:08.425 [2024-11-08 04:50:43.369496] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.801 04:50:44 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:09.801 00:07:09.801 SPDK Configuration: 00:07:09.801 Core mask: 0xf 00:07:09.801 00:07:09.801 Accel Perf Configuration: 00:07:09.801 Workload Type: decompress 00:07:09.801 Transfer size: 4096 bytes 00:07:09.801 Vector count 1 00:07:09.801 Module: software 00:07:09.801 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:09.801 Queue depth: 32 00:07:09.801 Allocate depth: 32 00:07:09.801 # threads/core: 1 00:07:09.801 Run time: 1 seconds 00:07:09.801 Verify: Yes 00:07:09.801 00:07:09.801 Running for 1 seconds... 00:07:09.801 00:07:09.801 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:09.801 ------------------------------------------------------------------------------------ 00:07:09.801 0,0 75200/s 138 MiB/s 0 0 00:07:09.801 3,0 75648/s 139 MiB/s 0 0 00:07:09.801 2,0 75872/s 139 MiB/s 0 0 00:07:09.801 1,0 75808/s 139 MiB/s 0 0 00:07:09.801 ==================================================================================== 00:07:09.801 Total 302528/s 1181 MiB/s 0 0' 00:07:09.801 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.801 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.801 04:50:44 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:09.801 04:50:44 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:09.801 04:50:44 -- accel/accel.sh@12 -- # build_accel_config 00:07:09.801 04:50:44 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:09.801 04:50:44 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:09.801 04:50:44 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:09.801 04:50:44 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:09.801 04:50:44 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:09.801 04:50:44 -- accel/accel.sh@41 -- # local IFS=, 00:07:09.802 04:50:44 -- accel/accel.sh@42 -- # jq -r . 00:07:09.802 [2024-11-08 04:50:44.566826] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
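The -m 0xf handed to accel_test above is a plain bitmask over cores 0 through 3, which is why the EAL line shows -c 0xf, four reactors start, and the table carries one row per core. A sketch of how such a mask is assembled from a core list:

mask=0
for core in 0 1 2 3; do
    mask=$(( mask | (1 << core) ))   # set one bit per core
done
printf '0x%x\n' "$mask"              # prints 0xf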
00:07:09.802 [2024-11-08 04:50:44.566915] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3675122 ] 00:07:09.802 EAL: No free 2048 kB hugepages reported on node 1 00:07:09.802 [2024-11-08 04:50:44.634916] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:09.802 [2024-11-08 04:50:44.704684] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:09.802 [2024-11-08 04:50:44.704779] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:09.802 [2024-11-08 04:50:44.704863] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:09.802 [2024-11-08 04:50:44.704864] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val= 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val= 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val= 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val=0xf 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val= 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val= 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val=decompress 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val= 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val=software 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@23 -- # accel_module=software 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val=32 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val=32 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val=1 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val=Yes 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val= 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:09.802 04:50:44 -- accel/accel.sh@21 -- # val= 00:07:09.802 04:50:44 -- accel/accel.sh@22 -- # case "$var" in 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # IFS=: 00:07:09.802 04:50:44 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 
04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@21 -- # val= 00:07:11.177 04:50:45 -- accel/accel.sh@22 -- # case "$var" in 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # IFS=: 00:07:11.177 04:50:45 -- accel/accel.sh@20 -- # read -r var val 00:07:11.177 04:50:45 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:11.177 04:50:45 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:11.177 04:50:45 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:11.177 00:07:11.177 real 0m2.681s 00:07:11.177 user 0m9.074s 00:07:11.177 sys 0m0.267s 00:07:11.177 04:50:45 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:11.177 04:50:45 -- common/autotest_common.sh@10 -- # set +x 00:07:11.177 ************************************ 00:07:11.177 END TEST accel_decomp_mcore 00:07:11.177 ************************************ 00:07:11.177 04:50:45 -- accel/accel.sh@112 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.177 04:50:45 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:11.177 04:50:45 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:11.177 04:50:45 -- common/autotest_common.sh@10 -- # set +x 00:07:11.177 ************************************ 00:07:11.177 START TEST accel_decomp_full_mcore 00:07:11.177 ************************************ 00:07:11.177 04:50:45 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.177 04:50:45 -- accel/accel.sh@16 -- # local accel_opc 00:07:11.177 04:50:45 -- accel/accel.sh@17 -- # local accel_module 00:07:11.177 04:50:45 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.177 04:50:45 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:11.177 04:50:45 -- accel/accel.sh@12 -- # build_accel_config 00:07:11.177 04:50:45 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:11.177 04:50:45 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:11.177 04:50:45 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:11.177 04:50:45 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:11.177 04:50:45 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:11.177 04:50:45 -- accel/accel.sh@41 -- # local IFS=, 00:07:11.177 04:50:45 -- accel/accel.sh@42 -- # jq -r . 00:07:11.177 [2024-11-08 04:50:45.961906] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
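Every unit in this log is driven through run_test, which prints the starred START TEST / END TEST banners with the real/user/sys times in between. SPDK's actual helper in autotest_common.sh also manages xtrace state and exit codes; the basic banner-and-timing shape, sketched:

run_test_sketch() {
    local name=$1; shift
    echo "************************************"
    echo "START TEST $name"
    echo "************************************"
    time "$@"      # the timed body; its real/user/sys lines land in the log
    local rc=$?
    echo "************************************"
    echo "END TEST $name"
    echo "************************************"
    return "$rc"
}
run_test_sketch accel_decomp_full_mcore echo "test body goes here"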
00:07:11.177 [2024-11-08 04:50:45.961997] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3675407 ] 00:07:11.177 EAL: No free 2048 kB hugepages reported on node 1 00:07:11.177 [2024-11-08 04:50:46.030915] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:11.177 [2024-11-08 04:50:46.101192] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:11.177 [2024-11-08 04:50:46.101287] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:11.177 [2024-11-08 04:50:46.101376] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:11.177 [2024-11-08 04:50:46.101378] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.553 04:50:47 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:12.553 00:07:12.553 SPDK Configuration: 00:07:12.553 Core mask: 0xf 00:07:12.553 00:07:12.553 Accel Perf Configuration: 00:07:12.553 Workload Type: decompress 00:07:12.553 Transfer size: 111250 bytes 00:07:12.553 Vector count 1 00:07:12.553 Module: software 00:07:12.553 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.553 Queue depth: 32 00:07:12.553 Allocate depth: 32 00:07:12.553 # threads/core: 1 00:07:12.553 Run time: 1 seconds 00:07:12.553 Verify: Yes 00:07:12.553 00:07:12.553 Running for 1 seconds... 00:07:12.553 00:07:12.553 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:12.553 ------------------------------------------------------------------------------------ 00:07:12.553 0,0 5792/s 239 MiB/s 0 0 00:07:12.553 3,0 5824/s 240 MiB/s 0 0 00:07:12.553 2,0 5824/s 240 MiB/s 0 0 00:07:12.553 1,0 5824/s 240 MiB/s 0 0 00:07:12.553 ==================================================================================== 00:07:12.553 Total 23264/s 2468 MiB/s 0 0' 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:12.553 04:50:47 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:12.553 04:50:47 -- accel/accel.sh@12 -- # build_accel_config 00:07:12.553 04:50:47 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:12.553 04:50:47 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.553 04:50:47 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.553 04:50:47 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:12.553 04:50:47 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:12.553 04:50:47 -- accel/accel.sh@41 -- # local IFS=, 00:07:12.553 04:50:47 -- accel/accel.sh@42 -- # jq -r . 00:07:12.553 [2024-11-08 04:50:47.315242] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
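In the multicore tables the Total row is the per-core rows summed: 5792 + 5824 + 5824 + 5824 = 23264 transfers/s above. A sketch that recomputes that total from rows shaped like the log's "0,0 5792/s 239 MiB/s 0 0" lines, assuming they were saved to a hypothetical perf_table.txt:

# Sum the transfers/s column over the per-core rows (field 1 is "core,thread",
# which skips the Total row whose first field is the word "Total").
awk '$1 ~ /^[0-9]+,[0-9]+$/ {
    sub(/\/s$/, "", $2)   # "5792/s" -> "5792"
    total += $2
} END { print total "/s" }' perf_table.txt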
00:07:12.553 [2024-11-08 04:50:47.315330] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3675682 ] 00:07:12.553 EAL: No free 2048 kB hugepages reported on node 1 00:07:12.553 [2024-11-08 04:50:47.384029] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:12.553 [2024-11-08 04:50:47.454105] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:12.553 [2024-11-08 04:50:47.454196] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:12.553 [2024-11-08 04:50:47.454281] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 3 00:07:12.553 [2024-11-08 04:50:47.454283] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val= 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val= 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val= 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val=0xf 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val= 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val= 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val=decompress 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val= 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val=software 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@23 -- # accel_module=software 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case 
"$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val=32 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val=32 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val=1 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val=Yes 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val= 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:12.553 04:50:47 -- accel/accel.sh@21 -- # val= 00:07:12.553 04:50:47 -- accel/accel.sh@22 -- # case "$var" in 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # IFS=: 00:07:12.553 04:50:47 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 
04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@21 -- # val= 00:07:13.929 04:50:48 -- accel/accel.sh@22 -- # case "$var" in 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # IFS=: 00:07:13.929 04:50:48 -- accel/accel.sh@20 -- # read -r var val 00:07:13.929 04:50:48 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:13.929 04:50:48 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:13.929 04:50:48 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:13.929 00:07:13.929 real 0m2.711s 00:07:13.929 user 0m9.126s 00:07:13.929 sys 0m0.298s 00:07:13.929 04:50:48 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:13.929 04:50:48 -- common/autotest_common.sh@10 -- # set +x 00:07:13.929 ************************************ 00:07:13.929 END TEST accel_decomp_full_mcore 00:07:13.929 ************************************ 00:07:13.929 04:50:48 -- accel/accel.sh@113 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.929 04:50:48 -- common/autotest_common.sh@1087 -- # '[' 11 -le 1 ']' 00:07:13.929 04:50:48 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:13.929 04:50:48 -- common/autotest_common.sh@10 -- # set +x 00:07:13.929 ************************************ 00:07:13.929 START TEST accel_decomp_mthread 00:07:13.929 ************************************ 00:07:13.929 04:50:48 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.929 04:50:48 -- accel/accel.sh@16 -- # local accel_opc 00:07:13.929 04:50:48 -- accel/accel.sh@17 -- # local accel_module 00:07:13.929 04:50:48 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.929 04:50:48 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:13.929 04:50:48 -- accel/accel.sh@12 -- # build_accel_config 00:07:13.929 04:50:48 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:13.929 04:50:48 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:13.929 04:50:48 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:13.929 04:50:48 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:13.929 04:50:48 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:13.929 04:50:48 -- accel/accel.sh@41 -- # local IFS=, 00:07:13.929 04:50:48 -- accel/accel.sh@42 -- # jq -r . 00:07:13.929 [2024-11-08 04:50:48.707980] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:13.930 [2024-11-08 04:50:48.708044] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3675968 ] 00:07:13.930 EAL: No free 2048 kB hugepages reported on node 1 00:07:13.930 [2024-11-08 04:50:48.771757] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.930 [2024-11-08 04:50:48.839186] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.306 04:50:50 -- accel/accel.sh@18 -- # out='Preparing input file... 
00:07:15.306 00:07:15.306 SPDK Configuration: 00:07:15.306 Core mask: 0x1 00:07:15.306 00:07:15.306 Accel Perf Configuration: 00:07:15.306 Workload Type: decompress 00:07:15.306 Transfer size: 4096 bytes 00:07:15.306 Vector count 1 00:07:15.306 Module: software 00:07:15.306 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.306 Queue depth: 32 00:07:15.306 Allocate depth: 32 00:07:15.306 # threads/core: 2 00:07:15.306 Run time: 1 seconds 00:07:15.306 Verify: Yes 00:07:15.306 00:07:15.306 Running for 1 seconds... 00:07:15.306 00:07:15.306 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:15.306 ------------------------------------------------------------------------------------ 00:07:15.306 0,1 46464/s 85 MiB/s 0 0 00:07:15.306 0,0 46336/s 85 MiB/s 0 0 00:07:15.306 ==================================================================================== 00:07:15.306 Total 92800/s 362 MiB/s 0 0' 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:15.306 04:50:50 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:15.306 04:50:50 -- accel/accel.sh@12 -- # build_accel_config 00:07:15.306 04:50:50 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:15.306 04:50:50 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:15.306 04:50:50 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:15.306 04:50:50 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:15.306 04:50:50 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:15.306 04:50:50 -- accel/accel.sh@41 -- # local IFS=, 00:07:15.306 04:50:50 -- accel/accel.sh@42 -- # jq -r . 00:07:15.306 [2024-11-08 04:50:50.037479] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
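The long runs of 'case "$var" in / IFS=: / read -r var val' that resume below are accel.sh consuming accel_perf's echoed configuration one "key: value" line at a time. A stripped-down sketch of that reader pattern (the real script tracks far more keys than these three):

while IFS=: read -r var val; do
    case "$var" in
        "Workload Type") opc=${val# } ;;     # e.g. decompress
        "Transfer size") size=${val# } ;;    # e.g. 4096 bytes
        "Module")        module=${val# } ;;  # e.g. software
    esac
done <<'EOF'
Workload Type: decompress
Transfer size: 4096 bytes
Module: software
EOF
echo "ran $opc via $module at $size per transfer"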
00:07:15.306 [2024-11-08 04:50:50.037581] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3676122 ] 00:07:15.306 EAL: No free 2048 kB hugepages reported on node 1 00:07:15.306 [2024-11-08 04:50:50.109651] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.306 [2024-11-08 04:50:50.182408] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val= 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val= 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val= 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val=0x1 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val= 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val= 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val=decompress 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val='4096 bytes' 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val= 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.306 04:50:50 -- accel/accel.sh@21 -- # val=software 00:07:15.306 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.306 04:50:50 -- accel/accel.sh@23 -- # accel_module=software 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.306 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.307 04:50:50 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:15.307 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.307 04:50:50 -- accel/accel.sh@21 -- # val=32 00:07:15.307 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.307 
04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.307 04:50:50 -- accel/accel.sh@21 -- # val=32 00:07:15.307 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.307 04:50:50 -- accel/accel.sh@21 -- # val=2 00:07:15.307 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.307 04:50:50 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:15.307 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.307 04:50:50 -- accel/accel.sh@21 -- # val=Yes 00:07:15.307 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.307 04:50:50 -- accel/accel.sh@21 -- # val= 00:07:15.307 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:15.307 04:50:50 -- accel/accel.sh@21 -- # val= 00:07:15.307 04:50:50 -- accel/accel.sh@22 -- # case "$var" in 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # IFS=: 00:07:15.307 04:50:50 -- accel/accel.sh@20 -- # read -r var val 00:07:16.683 04:50:51 -- accel/accel.sh@21 -- # val= 00:07:16.683 04:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # IFS=: 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # read -r var val 00:07:16.683 04:50:51 -- accel/accel.sh@21 -- # val= 00:07:16.683 04:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # IFS=: 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # read -r var val 00:07:16.683 04:50:51 -- accel/accel.sh@21 -- # val= 00:07:16.683 04:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # IFS=: 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # read -r var val 00:07:16.683 04:50:51 -- accel/accel.sh@21 -- # val= 00:07:16.683 04:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # IFS=: 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # read -r var val 00:07:16.683 04:50:51 -- accel/accel.sh@21 -- # val= 00:07:16.683 04:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # IFS=: 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # read -r var val 00:07:16.683 04:50:51 -- accel/accel.sh@21 -- # val= 00:07:16.683 04:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # IFS=: 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # read -r var val 00:07:16.683 04:50:51 -- accel/accel.sh@21 -- # val= 00:07:16.683 04:50:51 -- accel/accel.sh@22 -- # case "$var" in 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # IFS=: 00:07:16.683 04:50:51 -- accel/accel.sh@20 -- # read -r var val 00:07:16.683 04:50:51 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:16.683 04:50:51 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:16.683 04:50:51 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:16.683 00:07:16.683 real 0m2.669s 00:07:16.683 user 0m2.428s 00:07:16.683 sys 0m0.251s 00:07:16.683 04:50:51 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:16.683 04:50:51 -- common/autotest_common.sh@10 -- # 
set +x 00:07:16.683 ************************************ 00:07:16.683 END TEST accel_decomp_mthread 00:07:16.683 ************************************ 00:07:16.683 04:50:51 -- accel/accel.sh@114 -- # run_test accel_deomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:16.683 04:50:51 -- common/autotest_common.sh@1087 -- # '[' 13 -le 1 ']' 00:07:16.683 04:50:51 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:16.683 04:50:51 -- common/autotest_common.sh@10 -- # set +x 00:07:16.683 ************************************ 00:07:16.683 START TEST accel_deomp_full_mthread 00:07:16.683 ************************************ 00:07:16.683 04:50:51 -- common/autotest_common.sh@1114 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:16.683 04:50:51 -- accel/accel.sh@16 -- # local accel_opc 00:07:16.684 04:50:51 -- accel/accel.sh@17 -- # local accel_module 00:07:16.684 04:50:51 -- accel/accel.sh@18 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:16.684 04:50:51 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:16.684 04:50:51 -- accel/accel.sh@12 -- # build_accel_config 00:07:16.684 04:50:51 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:16.684 04:50:51 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.684 04:50:51 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.684 04:50:51 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:16.684 04:50:51 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:16.684 04:50:51 -- accel/accel.sh@41 -- # local IFS=, 00:07:16.684 04:50:51 -- accel/accel.sh@42 -- # jq -r . 00:07:16.684 [2024-11-08 04:50:51.435645] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:16.684 [2024-11-08 04:50:51.435719] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3676342 ] 00:07:16.684 EAL: No free 2048 kB hugepages reported on node 1 00:07:16.684 [2024-11-08 04:50:51.503605] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:16.684 [2024-11-08 04:50:51.572755] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.060 04:50:52 -- accel/accel.sh@18 -- # out='Preparing input file... 00:07:18.060 00:07:18.060 SPDK Configuration: 00:07:18.060 Core mask: 0x1 00:07:18.060 00:07:18.060 Accel Perf Configuration: 00:07:18.060 Workload Type: decompress 00:07:18.060 Transfer size: 111250 bytes 00:07:18.060 Vector count 1 00:07:18.060 Module: software 00:07:18.060 File Name: /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:18.060 Queue depth: 32 00:07:18.060 Allocate depth: 32 00:07:18.060 # threads/core: 2 00:07:18.060 Run time: 1 seconds 00:07:18.060 Verify: Yes 00:07:18.060 00:07:18.060 Running for 1 seconds... 
00:07:18.060 00:07:18.060 Core,Thread Transfers Bandwidth Failed Miscompares 00:07:18.060 ------------------------------------------------------------------------------------ 00:07:18.060 0,1 2976/s 122 MiB/s 0 0 00:07:18.060 0,0 2976/s 122 MiB/s 0 0 00:07:18.060 ==================================================================================== 00:07:18.060 Total 5952/s 631 MiB/s 0 0' 00:07:18.060 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.060 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.060 04:50:52 -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:18.060 04:50:52 -- accel/accel.sh@12 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:18.060 04:50:52 -- accel/accel.sh@12 -- # build_accel_config 00:07:18.060 04:50:52 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:18.060 04:50:52 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.060 04:50:52 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.060 04:50:52 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:18.060 04:50:52 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:18.060 04:50:52 -- accel/accel.sh@41 -- # local IFS=, 00:07:18.060 04:50:52 -- accel/accel.sh@42 -- # jq -r . 00:07:18.060 [2024-11-08 04:50:52.784798] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:18.061 [2024-11-08 04:50:52.784889] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3676547 ] 00:07:18.061 EAL: No free 2048 kB hugepages reported on node 1 00:07:18.061 [2024-11-08 04:50:52.854584] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:18.061 [2024-11-08 04:50:52.922072] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val= 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val= 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val= 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val=0x1 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val= 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val= 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val=decompress 
00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@24 -- # accel_opc=decompress 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val='111250 bytes' 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val= 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val=software 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@23 -- # accel_module=software 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/bib 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val=32 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val=32 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val=2 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val='1 seconds' 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val=Yes 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val= 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:18.061 04:50:52 -- accel/accel.sh@21 -- # val= 00:07:18.061 04:50:52 -- accel/accel.sh@22 -- # case "$var" in 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # IFS=: 00:07:18.061 04:50:52 -- accel/accel.sh@20 -- # read -r var val 00:07:19.436 04:50:54 -- accel/accel.sh@21 -- # val= 00:07:19.436 04:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:19.436 04:50:54 -- accel/accel.sh@21 -- # val= 00:07:19.436 04:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:19.436 04:50:54 -- accel/accel.sh@21 -- # val= 00:07:19.436 04:50:54 -- 
accel/accel.sh@22 -- # case "$var" in 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:19.436 04:50:54 -- accel/accel.sh@21 -- # val= 00:07:19.436 04:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:19.436 04:50:54 -- accel/accel.sh@21 -- # val= 00:07:19.436 04:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:19.436 04:50:54 -- accel/accel.sh@21 -- # val= 00:07:19.436 04:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:19.436 04:50:54 -- accel/accel.sh@21 -- # val= 00:07:19.436 04:50:54 -- accel/accel.sh@22 -- # case "$var" in 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # IFS=: 00:07:19.436 04:50:54 -- accel/accel.sh@20 -- # read -r var val 00:07:19.436 04:50:54 -- accel/accel.sh@28 -- # [[ -n software ]] 00:07:19.436 04:50:54 -- accel/accel.sh@28 -- # [[ -n decompress ]] 00:07:19.437 04:50:54 -- accel/accel.sh@28 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:19.437 00:07:19.437 real 0m2.702s 00:07:19.437 user 0m2.454s 00:07:19.437 sys 0m0.246s 00:07:19.437 04:50:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:19.437 04:50:54 -- common/autotest_common.sh@10 -- # set +x 00:07:19.437 ************************************ 00:07:19.437 END TEST accel_deomp_full_mthread 00:07:19.437 ************************************ 00:07:19.437 04:50:54 -- accel/accel.sh@116 -- # [[ n == y ]] 00:07:19.437 04:50:54 -- accel/accel.sh@129 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:19.437 04:50:54 -- common/autotest_common.sh@1087 -- # '[' 4 -le 1 ']' 00:07:19.437 04:50:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:19.437 04:50:54 -- common/autotest_common.sh@10 -- # set +x 00:07:19.437 04:50:54 -- accel/accel.sh@129 -- # build_accel_config 00:07:19.437 04:50:54 -- accel/accel.sh@32 -- # accel_json_cfg=() 00:07:19.437 04:50:54 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:19.437 04:50:54 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:19.437 04:50:54 -- accel/accel.sh@35 -- # [[ 0 -gt 0 ]] 00:07:19.437 04:50:54 -- accel/accel.sh@37 -- # [[ -n '' ]] 00:07:19.437 04:50:54 -- accel/accel.sh@41 -- # local IFS=, 00:07:19.437 04:50:54 -- accel/accel.sh@42 -- # jq -r . 00:07:19.437 ************************************ 00:07:19.437 START TEST accel_dif_functional_tests 00:07:19.437 ************************************ 00:07:19.437 04:50:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:19.437 [2024-11-08 04:50:54.181967] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
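The xtrace_disable / "set +x" pairs that bracket every test here mute bash command tracing around bookkeeping so only the interesting commands reach the log. SPDK's real helpers in autotest_common.sh also save and restore the prior trace state; the minimal shape, sketched:

xtrace_disable() { set +x; }   # stop echoing every command
xtrace_restore() { set -x; }   # resume echoing

set -x
echo "traced"
xtrace_disable
echo "quiet bookkeeping"
xtrace_restore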
00:07:19.437 [2024-11-08 04:50:54.182073] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3676831 ] 00:07:19.437 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.437 [2024-11-08 04:50:54.250084] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:19.437 [2024-11-08 04:50:54.319627] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 1 00:07:19.437 [2024-11-08 04:50:54.319713] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 2 00:07:19.437 [2024-11-08 04:50:54.319714] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.437 00:07:19.437 00:07:19.437 CUnit - A unit testing framework for C - Version 2.1-3 00:07:19.437 http://cunit.sourceforge.net/ 00:07:19.437 00:07:19.437 00:07:19.437 Suite: accel_dif 00:07:19.437 Test: verify: DIF generated, GUARD check ...passed 00:07:19.437 Test: verify: DIF generated, APPTAG check ...passed 00:07:19.437 Test: verify: DIF generated, REFTAG check ...passed 00:07:19.437 Test: verify: DIF not generated, GUARD check ...[2024-11-08 04:50:54.388899] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:19.437 [2024-11-08 04:50:54.388949] dif.c: 779:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:19.437 passed 00:07:19.437 Test: verify: DIF not generated, APPTAG check ...[2024-11-08 04:50:54.388985] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:19.437 [2024-11-08 04:50:54.389003] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:19.437 passed 00:07:19.437 Test: verify: DIF not generated, REFTAG check ...[2024-11-08 04:50:54.389025] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:19.437 [2024-11-08 04:50:54.389044] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:19.437 passed 00:07:19.437 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:19.437 Test: verify: APPTAG incorrect, APPTAG check ...[2024-11-08 04:50:54.389089] dif.c: 794:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:19.437 passed 00:07:19.437 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:07:19.437 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:19.437 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:19.437 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-11-08 04:50:54.389190] dif.c: 815:_dif_verify: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:19.437 passed 00:07:19.437 Test: generate copy: DIF generated, GUARD check ...passed 00:07:19.437 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:19.437 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:19.437 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:19.437 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:19.437 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:19.437 Test: generate copy: iovecs-len validate ...[2024-11-08 04:50:54.389369] dif.c:1167:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
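The DIF app above was launched with core mask -c 0x7, and the EAL banner duly reports three available cores and three reactors on cores 0 to 2: the reactor count is simply the population count of the mask. A quick illustration in shell arithmetic:

    # Popcount of an EAL core mask; 0x7 -> 3, matching the three
    # "Reactor started on core N" notices above.
    mask=$(( 0x7 )); count=0
    while (( mask > 0 )); do
        (( count += mask & 1 )); (( mask >>= 1 ))
    done
    echo "$count reactors expected"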
00:07:19.437 passed 00:07:19.437 Test: generate copy: buffer alignment validate ...passed 00:07:19.437 00:07:19.437 Run Summary: Type Total Ran Passed Failed Inactive 00:07:19.437 suites 1 1 n/a 0 0 00:07:19.437 tests 20 20 20 0 0 00:07:19.437 asserts 204 204 204 0 n/a 00:07:19.437 00:07:19.437 Elapsed time = 0.000 seconds 00:07:19.695 00:07:19.695 real 0m0.392s 00:07:19.695 user 0m0.585s 00:07:19.695 sys 0m0.164s 00:07:19.695 04:50:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:19.695 04:50:54 -- common/autotest_common.sh@10 -- # set +x 00:07:19.695 ************************************ 00:07:19.696 END TEST accel_dif_functional_tests 00:07:19.696 ************************************ 00:07:19.696 00:07:19.696 real 0m57.209s 00:07:19.696 user 1m4.874s 00:07:19.696 sys 0m7.068s 00:07:19.696 04:50:54 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:19.696 04:50:54 -- common/autotest_common.sh@10 -- # set +x 00:07:19.696 ************************************ 00:07:19.696 END TEST accel 00:07:19.696 ************************************ 00:07:19.696 04:50:54 -- spdk/autotest.sh@177 -- # run_test accel_rpc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:19.696 04:50:54 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:19.696 04:50:54 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:19.696 04:50:54 -- common/autotest_common.sh@10 -- # set +x 00:07:19.696 ************************************ 00:07:19.696 START TEST accel_rpc 00:07:19.696 ************************************ 00:07:19.696 04:50:54 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:19.696 * Looking for test storage... 00:07:19.696 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/accel 00:07:19.696 04:50:54 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:19.696 04:50:54 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:19.696 04:50:54 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:19.696 04:50:54 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:19.696 04:50:54 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:19.696 04:50:54 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:19.696 04:50:54 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:19.696 04:50:54 -- scripts/common.sh@335 -- # IFS=.-: 00:07:19.696 04:50:54 -- scripts/common.sh@335 -- # read -ra ver1 00:07:19.696 04:50:54 -- scripts/common.sh@336 -- # IFS=.-: 00:07:19.696 04:50:54 -- scripts/common.sh@336 -- # read -ra ver2 00:07:19.696 04:50:54 -- scripts/common.sh@337 -- # local 'op=<' 00:07:19.696 04:50:54 -- scripts/common.sh@339 -- # ver1_l=2 00:07:19.696 04:50:54 -- scripts/common.sh@340 -- # ver2_l=1 00:07:19.696 04:50:54 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:19.696 04:50:54 -- scripts/common.sh@343 -- # case "$op" in 00:07:19.696 04:50:54 -- scripts/common.sh@344 -- # : 1 00:07:19.696 04:50:54 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:19.696 04:50:54 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:19.696 04:50:54 -- scripts/common.sh@364 -- # decimal 1 00:07:19.954 04:50:54 -- scripts/common.sh@352 -- # local d=1 00:07:19.954 04:50:54 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:19.954 04:50:54 -- scripts/common.sh@354 -- # echo 1 00:07:19.954 04:50:54 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:19.954 04:50:54 -- scripts/common.sh@365 -- # decimal 2 00:07:19.954 04:50:54 -- scripts/common.sh@352 -- # local d=2 00:07:19.954 04:50:54 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:19.954 04:50:54 -- scripts/common.sh@354 -- # echo 2 00:07:19.954 04:50:54 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:19.954 04:50:54 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:19.954 04:50:54 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:19.954 04:50:54 -- scripts/common.sh@367 -- # return 0 00:07:19.954 04:50:54 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:19.954 04:50:54 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:19.954 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.954 --rc genhtml_branch_coverage=1 00:07:19.954 --rc genhtml_function_coverage=1 00:07:19.954 --rc genhtml_legend=1 00:07:19.954 --rc geninfo_all_blocks=1 00:07:19.954 --rc geninfo_unexecuted_blocks=1 00:07:19.954 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.954 ' 00:07:19.954 04:50:54 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:19.954 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.954 --rc genhtml_branch_coverage=1 00:07:19.954 --rc genhtml_function_coverage=1 00:07:19.954 --rc genhtml_legend=1 00:07:19.954 --rc geninfo_all_blocks=1 00:07:19.954 --rc geninfo_unexecuted_blocks=1 00:07:19.954 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.954 ' 00:07:19.954 04:50:54 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:19.954 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.954 --rc genhtml_branch_coverage=1 00:07:19.954 --rc genhtml_function_coverage=1 00:07:19.954 --rc genhtml_legend=1 00:07:19.954 --rc geninfo_all_blocks=1 00:07:19.954 --rc geninfo_unexecuted_blocks=1 00:07:19.954 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.954 ' 00:07:19.954 04:50:54 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:19.954 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:19.954 --rc genhtml_branch_coverage=1 00:07:19.954 --rc genhtml_function_coverage=1 00:07:19.954 --rc genhtml_legend=1 00:07:19.954 --rc geninfo_all_blocks=1 00:07:19.954 --rc geninfo_unexecuted_blocks=1 00:07:19.954 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:19.954 ' 00:07:19.955 04:50:54 -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:19.955 04:50:54 -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=3677115 00:07:19.955 04:50:54 -- accel/accel_rpc.sh@15 -- # waitforlisten 3677115 00:07:19.955 04:50:54 -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:19.955 04:50:54 -- common/autotest_common.sh@829 -- # '[' -z 3677115 ']' 00:07:19.955 04:50:54 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:19.955 04:50:54 -- common/autotest_common.sh@834 -- # local max_retries=100 
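The waitforlisten sequence traced above (note the max_retries=100 default) boils down to polling the target's UNIX-domain RPC socket until a trivial method answers. A condensed sketch of that loop, with paths and the retry budget assumed rather than copied from the harness:

    # Start the target, then poll until its RPC socket responds.
    ./build/bin/spdk_tgt --wait-for-rpc &
    pid=$!
    for _ in $(seq 1 100); do
        scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods &>/dev/null && break
        kill -0 "$pid" 2>/dev/null || { echo "target died" >&2; exit 1; }
        sleep 0.1
    done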
00:07:19.955 04:50:54 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:19.955 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:19.955 04:50:54 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:19.955 04:50:54 -- common/autotest_common.sh@10 -- # set +x 00:07:19.955 [2024-11-08 04:50:54.840840] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:19.955 [2024-11-08 04:50:54.840909] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3677115 ] 00:07:19.955 EAL: No free 2048 kB hugepages reported on node 1 00:07:19.955 [2024-11-08 04:50:54.907650] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:19.955 [2024-11-08 04:50:54.981913] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:19.955 [2024-11-08 04:50:54.982019] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:20.889 04:50:55 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:20.889 04:50:55 -- common/autotest_common.sh@862 -- # return 0 00:07:20.889 04:50:55 -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:20.889 04:50:55 -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:20.889 04:50:55 -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:20.889 04:50:55 -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:20.889 04:50:55 -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:20.889 04:50:55 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:20.889 04:50:55 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:20.889 04:50:55 -- common/autotest_common.sh@10 -- # set +x 00:07:20.889 ************************************ 00:07:20.889 START TEST accel_assign_opcode 00:07:20.889 ************************************ 00:07:20.889 04:50:55 -- common/autotest_common.sh@1114 -- # accel_assign_opcode_test_suite 00:07:20.889 04:50:55 -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:20.889 04:50:55 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:20.889 04:50:55 -- common/autotest_common.sh@10 -- # set +x 00:07:20.889 [2024-11-08 04:50:55.680071] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:07:20.889 04:50:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:20.889 04:50:55 -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:20.889 04:50:55 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:20.889 04:50:55 -- common/autotest_common.sh@10 -- # set +x 00:07:20.889 [2024-11-08 04:50:55.688087] accel_rpc.c: 168:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:20.889 04:50:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:20.889 04:50:55 -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:20.889 04:50:55 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:20.890 04:50:55 -- common/autotest_common.sh@10 -- # set +x 00:07:20.890 04:50:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:20.890 04:50:55 -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:20.890 04:50:55 -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:20.890 
04:50:55 -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:20.890 04:50:55 -- common/autotest_common.sh@10 -- # set +x 00:07:20.890 04:50:55 -- accel/accel_rpc.sh@42 -- # grep software 00:07:20.890 04:50:55 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:20.890 software 00:07:20.890 00:07:20.890 real 0m0.228s 00:07:20.890 user 0m0.035s 00:07:20.890 sys 0m0.014s 00:07:20.890 04:50:55 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:20.890 04:50:55 -- common/autotest_common.sh@10 -- # set +x 00:07:20.890 ************************************ 00:07:20.890 END TEST accel_assign_opcode 00:07:20.890 ************************************ 00:07:20.890 04:50:55 -- accel/accel_rpc.sh@55 -- # killprocess 3677115 00:07:20.890 04:50:55 -- common/autotest_common.sh@936 -- # '[' -z 3677115 ']' 00:07:20.890 04:50:55 -- common/autotest_common.sh@940 -- # kill -0 3677115 00:07:20.890 04:50:55 -- common/autotest_common.sh@941 -- # uname 00:07:20.890 04:50:55 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']' 00:07:20.890 04:50:55 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3677115 00:07:21.148 04:50:56 -- common/autotest_common.sh@942 -- # process_name=reactor_0 00:07:21.148 04:50:56 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']' 00:07:21.148 04:50:56 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3677115' 00:07:21.148 killing process with pid 3677115 00:07:21.148 04:50:56 -- common/autotest_common.sh@955 -- # kill 3677115 00:07:21.148 04:50:56 -- common/autotest_common.sh@960 -- # wait 3677115 00:07:21.406 00:07:21.406 real 0m1.679s 00:07:21.407 user 0m1.701s 00:07:21.407 sys 0m0.489s 00:07:21.407 04:50:56 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:21.407 04:50:56 -- common/autotest_common.sh@10 -- # set +x 00:07:21.407 ************************************ 00:07:21.407 END TEST accel_rpc 00:07:21.407 ************************************ 00:07:21.407 04:50:56 -- spdk/autotest.sh@178 -- # run_test app_cmdline /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:21.407 04:50:56 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:21.407 04:50:56 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:21.407 04:50:56 -- common/autotest_common.sh@10 -- # set +x 00:07:21.407 ************************************ 00:07:21.407 START TEST app_cmdline 00:07:21.407 ************************************ 00:07:21.407 04:50:56 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/cmdline.sh 00:07:21.407 * Looking for test storage... 
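Stripped of the xtrace noise, the accel_assign_opcode flow that just passed is three RPCs, and the assignment is only accepted because the target was started with --wait-for-rpc, i.e. before framework initialization:

    # Condensed from the trace above (rpc.py invoked from the repo root):
    scripts/rpc.py accel_assign_opc -o copy -m software
    scripts/rpc.py framework_start_init
    scripts/rpc.py accel_get_opc_assignments | jq -r .copy   # prints "software"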
00:07:21.407 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:21.407 04:50:56 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:21.407 04:50:56 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:21.407 04:50:56 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:21.665 04:50:56 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:21.665 04:50:56 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:21.665 04:50:56 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:21.665 04:50:56 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:21.665 04:50:56 -- scripts/common.sh@335 -- # IFS=.-: 00:07:21.665 04:50:56 -- scripts/common.sh@335 -- # read -ra ver1 00:07:21.665 04:50:56 -- scripts/common.sh@336 -- # IFS=.-: 00:07:21.665 04:50:56 -- scripts/common.sh@336 -- # read -ra ver2 00:07:21.665 04:50:56 -- scripts/common.sh@337 -- # local 'op=<' 00:07:21.665 04:50:56 -- scripts/common.sh@339 -- # ver1_l=2 00:07:21.665 04:50:56 -- scripts/common.sh@340 -- # ver2_l=1 00:07:21.665 04:50:56 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:21.665 04:50:56 -- scripts/common.sh@343 -- # case "$op" in 00:07:21.665 04:50:56 -- scripts/common.sh@344 -- # : 1 00:07:21.665 04:50:56 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:21.665 04:50:56 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:21.665 04:50:56 -- scripts/common.sh@364 -- # decimal 1 00:07:21.665 04:50:56 -- scripts/common.sh@352 -- # local d=1 00:07:21.666 04:50:56 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:21.666 04:50:56 -- scripts/common.sh@354 -- # echo 1 00:07:21.666 04:50:56 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:21.666 04:50:56 -- scripts/common.sh@365 -- # decimal 2 00:07:21.666 04:50:56 -- scripts/common.sh@352 -- # local d=2 00:07:21.666 04:50:56 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:21.666 04:50:56 -- scripts/common.sh@354 -- # echo 2 00:07:21.666 04:50:56 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:21.666 04:50:56 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:21.666 04:50:56 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:21.666 04:50:56 -- scripts/common.sh@367 -- # return 0 00:07:21.666 04:50:56 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:21.666 04:50:56 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:21.666 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:21.666 --rc genhtml_branch_coverage=1 00:07:21.666 --rc genhtml_function_coverage=1 00:07:21.666 --rc genhtml_legend=1 00:07:21.666 --rc geninfo_all_blocks=1 00:07:21.666 --rc geninfo_unexecuted_blocks=1 00:07:21.666 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:21.666 ' 00:07:21.666 04:50:56 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:21.666 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:21.666 --rc genhtml_branch_coverage=1 00:07:21.666 --rc genhtml_function_coverage=1 00:07:21.666 --rc genhtml_legend=1 00:07:21.666 --rc geninfo_all_blocks=1 00:07:21.666 --rc geninfo_unexecuted_blocks=1 00:07:21.666 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:21.666 ' 00:07:21.666 04:50:56 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:21.666 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:21.666 --rc genhtml_branch_coverage=1 00:07:21.666 
--rc genhtml_function_coverage=1 00:07:21.666 --rc genhtml_legend=1 00:07:21.666 --rc geninfo_all_blocks=1 00:07:21.666 --rc geninfo_unexecuted_blocks=1 00:07:21.666 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:21.666 ' 00:07:21.666 04:50:56 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:21.666 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:21.666 --rc genhtml_branch_coverage=1 00:07:21.666 --rc genhtml_function_coverage=1 00:07:21.666 --rc genhtml_legend=1 00:07:21.666 --rc geninfo_all_blocks=1 00:07:21.666 --rc geninfo_unexecuted_blocks=1 00:07:21.666 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:21.666 ' 00:07:21.666 04:50:56 -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:21.666 04:50:56 -- app/cmdline.sh@17 -- # spdk_tgt_pid=3677509 00:07:21.666 04:50:56 -- app/cmdline.sh@18 -- # waitforlisten 3677509 00:07:21.666 04:50:56 -- app/cmdline.sh@16 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:21.666 04:50:56 -- common/autotest_common.sh@829 -- # '[' -z 3677509 ']' 00:07:21.666 04:50:56 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:21.666 04:50:56 -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:21.666 04:50:56 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:21.666 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:21.666 04:50:56 -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:21.666 04:50:56 -- common/autotest_common.sh@10 -- # set +x 00:07:21.666 [2024-11-08 04:50:56.563103] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
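For this suite spdk_tgt runs with --rpcs-allowed spdk_get_version,rpc_get_methods, so only those two methods are served and anything else must fail with JSON-RPC error -32601, which is exactly what the spdk_get_version and env_dpdk_get_mem_stats probes below demonstrate. The same behaviour in isolation (startup polling omitted, see the wait loop sketched earlier):

    ./build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods &
    scripts/rpc.py spdk_get_version          # allowed: returns the version JSON
    scripts/rpc.py env_dpdk_get_mem_stats    # rejected: "Method not found" (-32601)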
00:07:21.666 [2024-11-08 04:50:56.563177] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3677509 ]
00:07:21.666 EAL: No free 2048 kB hugepages reported on node 1
00:07:21.666 [2024-11-08 04:50:56.628789] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:21.666 [2024-11-08 04:50:56.696853] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:21.666 [2024-11-08 04:50:56.696956] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:22.601 04:50:57 -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:07:22.601 04:50:57 -- common/autotest_common.sh@862 -- # return 0
00:07:22.601 04:50:57 -- app/cmdline.sh@20 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py spdk_get_version
00:07:22.601 {
00:07:22.601 "version": "SPDK v24.01.1-pre git sha1 c13c99a5e",
00:07:22.601 "fields": {
00:07:22.601 "major": 24,
00:07:22.601 "minor": 1,
00:07:22.601 "patch": 1,
00:07:22.601 "suffix": "-pre",
00:07:22.601 "commit": "c13c99a5e"
00:07:22.601 }
00:07:22.601 }
00:07:22.601 04:50:57 -- app/cmdline.sh@22 -- # expected_methods=()
00:07:22.601 04:50:57 -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods")
00:07:22.601 04:50:57 -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version")
00:07:22.601 04:50:57 -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort))
00:07:22.601 04:50:57 -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods
00:07:22.602 04:50:57 -- common/autotest_common.sh@561 -- # xtrace_disable
00:07:22.602 04:50:57 -- common/autotest_common.sh@10 -- # set +x
00:07:22.602 04:50:57 -- app/cmdline.sh@26 -- # jq -r '.[]'
00:07:22.602 04:50:57 -- app/cmdline.sh@26 -- # sort
00:07:22.602 04:50:57 -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:07:22.602 04:50:57 -- app/cmdline.sh@27 -- # (( 2 == 2 ))
00:07:22.602 04:50:57 -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]]
00:07:22.602 04:50:57 -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:07:22.602 04:50:57 -- common/autotest_common.sh@650 -- # local es=0
00:07:22.602 04:50:57 -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:07:22.602 04:50:57 -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
00:07:22.602 04:50:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:07:22.602 04:50:57 -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
00:07:22.602 04:50:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:07:22.602 04:50:57 -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
00:07:22.602 04:50:57 -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:07:22.602 04:50:57 -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py
00:07:22.602 04:50:57 -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py ]]
00:07:22.602 04:50:57 -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats
00:07:22.860 request:
00:07:22.860 {
00:07:22.860 "method": "env_dpdk_get_mem_stats",
00:07:22.860 "req_id": 1
00:07:22.860 }
00:07:22.860 Got JSON-RPC error response
response:
00:07:22.860 {
00:07:22.860 "code": -32601,
00:07:22.860 "message": "Method not found"
00:07:22.860 }
00:07:22.860 04:50:57 -- common/autotest_common.sh@653 -- # es=1
00:07:22.860 04:50:57 -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:07:22.860 04:50:57 -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:07:22.860 04:50:57 -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:07:22.860 04:50:57 -- app/cmdline.sh@1 -- # killprocess 3677509
00:07:22.860 04:50:57 -- common/autotest_common.sh@936 -- # '[' -z 3677509 ']'
00:07:22.860 04:50:57 -- common/autotest_common.sh@940 -- # kill -0 3677509
00:07:22.860 04:50:57 -- common/autotest_common.sh@941 -- # uname
00:07:22.860 04:50:57 -- common/autotest_common.sh@941 -- # '[' Linux = Linux ']'
00:07:22.860 04:50:57 -- common/autotest_common.sh@942 -- # ps --no-headers -o comm= 3677509
00:07:22.860 04:50:57 -- common/autotest_common.sh@942 -- # process_name=reactor_0
00:07:22.860 04:50:57 -- common/autotest_common.sh@946 -- # '[' reactor_0 = sudo ']'
00:07:22.860 04:50:57 -- common/autotest_common.sh@954 -- # echo 'killing process with pid 3677509'
killing process with pid 3677509
00:07:22.860 04:50:57 -- common/autotest_common.sh@955 -- # kill 3677509
00:07:22.860 04:50:57 -- common/autotest_common.sh@960 -- # wait 3677509
00:07:23.120
00:07:23.120 real 0m1.778s
00:07:23.120 user 0m2.038s
00:07:23.120 sys 0m0.509s
00:07:23.120 04:50:58 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:07:23.120 04:50:58 -- common/autotest_common.sh@10 -- # set +x
00:07:23.120 ************************************
00:07:23.120 END TEST app_cmdline
00:07:23.120 ************************************
00:07:23.120 04:50:58 -- spdk/autotest.sh@179 -- # run_test version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh
00:07:23.120 04:50:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']'
00:07:23.120 04:50:58 -- common/autotest_common.sh@1093 -- # xtrace_disable
00:07:23.120 04:50:58 -- common/autotest_common.sh@10 -- # set +x
00:07:23.120 ************************************
00:07:23.120 START TEST version
00:07:23.120 ************************************
00:07:23.120 04:50:58 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/version.sh
00:07:23.382 * Looking for test storage...
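The version suite traced below derives major, minor, patch and suffix by scraping include/spdk/version.h with a grep/cut/tr pipeline (cut -f2 works because the header tab-separates each #define from its value). Reduced to the major field:

    # get_header_version in miniature, run from the repo root:
    grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' include/spdk/version.h \
        | cut -f2 | tr -d '"'    # -> 24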
00:07:23.382 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:07:23.382 04:50:58 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:23.382 04:50:58 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:23.382 04:50:58 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:23.382 04:50:58 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:23.382 04:50:58 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:23.382 04:50:58 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:23.382 04:50:58 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:23.382 04:50:58 -- scripts/common.sh@335 -- # IFS=.-: 00:07:23.382 04:50:58 -- scripts/common.sh@335 -- # read -ra ver1 00:07:23.382 04:50:58 -- scripts/common.sh@336 -- # IFS=.-: 00:07:23.382 04:50:58 -- scripts/common.sh@336 -- # read -ra ver2 00:07:23.382 04:50:58 -- scripts/common.sh@337 -- # local 'op=<' 00:07:23.382 04:50:58 -- scripts/common.sh@339 -- # ver1_l=2 00:07:23.382 04:50:58 -- scripts/common.sh@340 -- # ver2_l=1 00:07:23.382 04:50:58 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:23.382 04:50:58 -- scripts/common.sh@343 -- # case "$op" in 00:07:23.382 04:50:58 -- scripts/common.sh@344 -- # : 1 00:07:23.382 04:50:58 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:23.382 04:50:58 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:23.382 04:50:58 -- scripts/common.sh@364 -- # decimal 1 00:07:23.382 04:50:58 -- scripts/common.sh@352 -- # local d=1 00:07:23.382 04:50:58 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:23.382 04:50:58 -- scripts/common.sh@354 -- # echo 1 00:07:23.382 04:50:58 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:23.382 04:50:58 -- scripts/common.sh@365 -- # decimal 2 00:07:23.382 04:50:58 -- scripts/common.sh@352 -- # local d=2 00:07:23.382 04:50:58 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:23.382 04:50:58 -- scripts/common.sh@354 -- # echo 2 00:07:23.382 04:50:58 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:23.382 04:50:58 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:23.382 04:50:58 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:23.382 04:50:58 -- scripts/common.sh@367 -- # return 0 00:07:23.382 04:50:58 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:23.382 04:50:58 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:23.382 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.382 --rc genhtml_branch_coverage=1 00:07:23.382 --rc genhtml_function_coverage=1 00:07:23.382 --rc genhtml_legend=1 00:07:23.382 --rc geninfo_all_blocks=1 00:07:23.382 --rc geninfo_unexecuted_blocks=1 00:07:23.382 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.382 ' 00:07:23.382 04:50:58 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:23.382 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.382 --rc genhtml_branch_coverage=1 00:07:23.382 --rc genhtml_function_coverage=1 00:07:23.382 --rc genhtml_legend=1 00:07:23.382 --rc geninfo_all_blocks=1 00:07:23.382 --rc geninfo_unexecuted_blocks=1 00:07:23.382 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.382 ' 00:07:23.382 04:50:58 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:23.382 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.382 --rc genhtml_branch_coverage=1 00:07:23.382 
--rc genhtml_function_coverage=1 00:07:23.382 --rc genhtml_legend=1 00:07:23.382 --rc geninfo_all_blocks=1 00:07:23.382 --rc geninfo_unexecuted_blocks=1 00:07:23.382 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.382 ' 00:07:23.382 04:50:58 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:23.382 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.383 --rc genhtml_branch_coverage=1 00:07:23.383 --rc genhtml_function_coverage=1 00:07:23.383 --rc genhtml_legend=1 00:07:23.383 --rc geninfo_all_blocks=1 00:07:23.383 --rc geninfo_unexecuted_blocks=1 00:07:23.383 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.383 ' 00:07:23.383 04:50:58 -- app/version.sh@17 -- # get_header_version major 00:07:23.383 04:50:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:23.383 04:50:58 -- app/version.sh@14 -- # cut -f2 00:07:23.383 04:50:58 -- app/version.sh@14 -- # tr -d '"' 00:07:23.383 04:50:58 -- app/version.sh@17 -- # major=24 00:07:23.383 04:50:58 -- app/version.sh@18 -- # get_header_version minor 00:07:23.383 04:50:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:23.383 04:50:58 -- app/version.sh@14 -- # cut -f2 00:07:23.383 04:50:58 -- app/version.sh@14 -- # tr -d '"' 00:07:23.383 04:50:58 -- app/version.sh@18 -- # minor=1 00:07:23.383 04:50:58 -- app/version.sh@19 -- # get_header_version patch 00:07:23.383 04:50:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:23.383 04:50:58 -- app/version.sh@14 -- # cut -f2 00:07:23.383 04:50:58 -- app/version.sh@14 -- # tr -d '"' 00:07:23.383 04:50:58 -- app/version.sh@19 -- # patch=1 00:07:23.383 04:50:58 -- app/version.sh@20 -- # get_header_version suffix 00:07:23.383 04:50:58 -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/version.h 00:07:23.383 04:50:58 -- app/version.sh@14 -- # cut -f2 00:07:23.383 04:50:58 -- app/version.sh@14 -- # tr -d '"' 00:07:23.383 04:50:58 -- app/version.sh@20 -- # suffix=-pre 00:07:23.383 04:50:58 -- app/version.sh@22 -- # version=24.1 00:07:23.383 04:50:58 -- app/version.sh@25 -- # (( patch != 0 )) 00:07:23.383 04:50:58 -- app/version.sh@25 -- # version=24.1.1 00:07:23.383 04:50:58 -- app/version.sh@28 -- # version=24.1.1rc0 00:07:23.383 04:50:58 -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:07:23.383 04:50:58 -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:23.383 04:50:58 -- app/version.sh@30 -- # py_version=24.1.1rc0 00:07:23.383 04:50:58 -- app/version.sh@31 -- # [[ 24.1.1rc0 == \2\4\.\1\.\1\r\c\0 ]] 00:07:23.383 00:07:23.383 real 0m0.246s 00:07:23.383 user 0m0.133s 00:07:23.383 sys 0m0.166s 00:07:23.383 04:50:58 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:07:23.383 04:50:58 -- common/autotest_common.sh@10 -- # set +x 00:07:23.383 
************************************ 00:07:23.383 END TEST version 00:07:23.383 ************************************ 00:07:23.383 04:50:58 -- spdk/autotest.sh@181 -- # '[' 0 -eq 1 ']' 00:07:23.383 04:50:58 -- spdk/autotest.sh@191 -- # uname -s 00:07:23.383 04:50:58 -- spdk/autotest.sh@191 -- # [[ Linux == Linux ]] 00:07:23.383 04:50:58 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:23.383 04:50:58 -- spdk/autotest.sh@192 -- # [[ 0 -eq 1 ]] 00:07:23.383 04:50:58 -- spdk/autotest.sh@204 -- # '[' 0 -eq 1 ']' 00:07:23.383 04:50:58 -- spdk/autotest.sh@251 -- # '[' 0 -eq 1 ']' 00:07:23.383 04:50:58 -- spdk/autotest.sh@255 -- # timing_exit lib 00:07:23.383 04:50:58 -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:23.383 04:50:58 -- common/autotest_common.sh@10 -- # set +x 00:07:23.699 04:50:58 -- spdk/autotest.sh@257 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@265 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@298 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@302 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@306 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@329 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@337 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@342 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']' 00:07:23.699 04:50:58 -- spdk/autotest.sh@353 -- # [[ 0 -eq 1 ]] 00:07:23.699 04:50:58 -- spdk/autotest.sh@357 -- # [[ 0 -eq 1 ]] 00:07:23.699 04:50:58 -- spdk/autotest.sh@361 -- # [[ 1 -eq 1 ]] 00:07:23.699 04:50:58 -- spdk/autotest.sh@362 -- # run_test llvm_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:23.699 04:50:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:23.699 04:50:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:23.699 04:50:58 -- common/autotest_common.sh@10 -- # set +x 00:07:23.699 ************************************ 00:07:23.699 START TEST llvm_fuzz 00:07:23.699 ************************************ 00:07:23.699 04:50:58 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm.sh 00:07:23.699 * Looking for test storage... 
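The autotest.sh dispatch above is one pattern repeated per suite: a numeric flag from the job's conf file gates each run_test call, which is why most gates here evaluate '[' 0 -eq 1 ']' and only the fuzzer gate is [[ 1 -eq 1 ]]. Schematically, assuming the gate variable is SPDK_TEST_FUZZER:

    if [[ $SPDK_TEST_FUZZER -eq 1 ]]; then
        run_test "llvm_fuzz" "$rootdir/test/fuzz/llvm.sh"
    fi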
00:07:23.699 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz 00:07:23.699 04:50:58 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:23.699 04:50:58 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:23.699 04:50:58 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:23.699 04:50:58 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:23.699 04:50:58 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:23.699 04:50:58 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:23.699 04:50:58 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:23.699 04:50:58 -- scripts/common.sh@335 -- # IFS=.-: 00:07:23.699 04:50:58 -- scripts/common.sh@335 -- # read -ra ver1 00:07:23.699 04:50:58 -- scripts/common.sh@336 -- # IFS=.-: 00:07:23.699 04:50:58 -- scripts/common.sh@336 -- # read -ra ver2 00:07:23.699 04:50:58 -- scripts/common.sh@337 -- # local 'op=<' 00:07:23.699 04:50:58 -- scripts/common.sh@339 -- # ver1_l=2 00:07:23.699 04:50:58 -- scripts/common.sh@340 -- # ver2_l=1 00:07:23.699 04:50:58 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:23.699 04:50:58 -- scripts/common.sh@343 -- # case "$op" in 00:07:23.699 04:50:58 -- scripts/common.sh@344 -- # : 1 00:07:23.699 04:50:58 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:23.699 04:50:58 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:23.699 04:50:58 -- scripts/common.sh@364 -- # decimal 1 00:07:23.699 04:50:58 -- scripts/common.sh@352 -- # local d=1 00:07:23.699 04:50:58 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:23.699 04:50:58 -- scripts/common.sh@354 -- # echo 1 00:07:23.699 04:50:58 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:23.699 04:50:58 -- scripts/common.sh@365 -- # decimal 2 00:07:23.699 04:50:58 -- scripts/common.sh@352 -- # local d=2 00:07:23.700 04:50:58 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:23.700 04:50:58 -- scripts/common.sh@354 -- # echo 2 00:07:23.700 04:50:58 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:23.700 04:50:58 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:23.700 04:50:58 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:23.700 04:50:58 -- scripts/common.sh@367 -- # return 0 00:07:23.700 04:50:58 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:23.700 04:50:58 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:23.700 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.700 --rc genhtml_branch_coverage=1 00:07:23.700 --rc genhtml_function_coverage=1 00:07:23.700 --rc genhtml_legend=1 00:07:23.700 --rc geninfo_all_blocks=1 00:07:23.700 --rc geninfo_unexecuted_blocks=1 00:07:23.700 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.700 ' 00:07:23.700 04:50:58 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:23.700 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.700 --rc genhtml_branch_coverage=1 00:07:23.700 --rc genhtml_function_coverage=1 00:07:23.700 --rc genhtml_legend=1 00:07:23.700 --rc geninfo_all_blocks=1 00:07:23.700 --rc geninfo_unexecuted_blocks=1 00:07:23.700 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.700 ' 00:07:23.700 04:50:58 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:23.700 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.700 --rc genhtml_branch_coverage=1 00:07:23.700 
--rc genhtml_function_coverage=1 00:07:23.700 --rc genhtml_legend=1 00:07:23.700 --rc geninfo_all_blocks=1 00:07:23.700 --rc geninfo_unexecuted_blocks=1 00:07:23.700 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.700 ' 00:07:23.700 04:50:58 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:23.700 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.700 --rc genhtml_branch_coverage=1 00:07:23.700 --rc genhtml_function_coverage=1 00:07:23.700 --rc genhtml_legend=1 00:07:23.700 --rc geninfo_all_blocks=1 00:07:23.700 --rc geninfo_unexecuted_blocks=1 00:07:23.700 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.700 ' 00:07:23.700 04:50:58 -- fuzz/llvm.sh@11 -- # fuzzers=($(get_fuzzer_targets)) 00:07:23.700 04:50:58 -- fuzz/llvm.sh@11 -- # get_fuzzer_targets 00:07:23.700 04:50:58 -- common/autotest_common.sh@548 -- # fuzzers=() 00:07:23.700 04:50:58 -- common/autotest_common.sh@548 -- # local fuzzers 00:07:23.700 04:50:58 -- common/autotest_common.sh@550 -- # [[ -n '' ]] 00:07:23.700 04:50:58 -- common/autotest_common.sh@553 -- # fuzzers=("$rootdir/test/fuzz/llvm/"*) 00:07:23.700 04:50:58 -- common/autotest_common.sh@554 -- # fuzzers=("${fuzzers[@]##*/}") 00:07:23.700 04:50:58 -- common/autotest_common.sh@557 -- # echo 'common.sh llvm-gcov.sh nvmf vfio' 00:07:23.700 04:50:58 -- fuzz/llvm.sh@13 -- # llvm_out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm 00:07:23.700 04:50:58 -- fuzz/llvm.sh@15 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/ /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/coverage 00:07:23.700 04:50:58 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:23.700 04:50:58 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:23.700 04:50:58 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:23.700 04:50:58 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:23.700 04:50:58 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:07:23.700 04:50:58 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:07:23.700 04:50:58 -- fuzz/llvm.sh@19 -- # run_test nvmf_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:23.700 04:50:58 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:07:23.700 04:50:58 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:07:23.700 04:50:58 -- common/autotest_common.sh@10 -- # set +x 00:07:23.700 ************************************ 00:07:23.700 START TEST nvmf_fuzz 00:07:23.700 ************************************ 00:07:23.700 04:50:58 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/run.sh 00:07:23.961 * Looking for test storage... 
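Target discovery in llvm.sh, seen above, is two bash array operations plus a case filter: glob the test/fuzz/llvm directory, strip every element down to its basename with ${fuzzers[@]##*/}, then skip helpers such as common.sh and llvm-gcov.sh. In miniature:

    fuzzers=("$rootdir/test/fuzz/llvm/"*)   # glob the directory contents
    fuzzers=("${fuzzers[@]##*/}")           # keep only the basenames
    for fuzzer in "${fuzzers[@]}"; do
        case "$fuzzer" in
            nvmf | vfio) echo "would run $fuzzer" ;;   # real fuzz targets
            *) ;;                                      # helper scripts, skipped
        esac
    done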
00:07:23.961 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf 00:07:23.961 04:50:58 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:07:23.961 04:50:58 -- common/autotest_common.sh@1690 -- # lcov --version 00:07:23.961 04:50:58 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:07:23.961 04:50:58 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:07:23.961 04:50:58 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:07:23.961 04:50:58 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:07:23.961 04:50:58 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:07:23.961 04:50:58 -- scripts/common.sh@335 -- # IFS=.-: 00:07:23.961 04:50:58 -- scripts/common.sh@335 -- # read -ra ver1 00:07:23.961 04:50:58 -- scripts/common.sh@336 -- # IFS=.-: 00:07:23.961 04:50:58 -- scripts/common.sh@336 -- # read -ra ver2 00:07:23.961 04:50:58 -- scripts/common.sh@337 -- # local 'op=<' 00:07:23.961 04:50:58 -- scripts/common.sh@339 -- # ver1_l=2 00:07:23.961 04:50:58 -- scripts/common.sh@340 -- # ver2_l=1 00:07:23.961 04:50:58 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:07:23.961 04:50:58 -- scripts/common.sh@343 -- # case "$op" in 00:07:23.961 04:50:58 -- scripts/common.sh@344 -- # : 1 00:07:23.961 04:50:58 -- scripts/common.sh@363 -- # (( v = 0 )) 00:07:23.961 04:50:58 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:07:23.961 04:50:58 -- scripts/common.sh@364 -- # decimal 1 00:07:23.961 04:50:58 -- scripts/common.sh@352 -- # local d=1 00:07:23.961 04:50:58 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:23.961 04:50:58 -- scripts/common.sh@354 -- # echo 1 00:07:23.961 04:50:58 -- scripts/common.sh@364 -- # ver1[v]=1 00:07:23.961 04:50:58 -- scripts/common.sh@365 -- # decimal 2 00:07:23.961 04:50:58 -- scripts/common.sh@352 -- # local d=2 00:07:23.961 04:50:58 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:23.961 04:50:58 -- scripts/common.sh@354 -- # echo 2 00:07:23.961 04:50:58 -- scripts/common.sh@365 -- # ver2[v]=2 00:07:23.961 04:50:58 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:07:23.961 04:50:58 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:07:23.961 04:50:58 -- scripts/common.sh@367 -- # return 0 00:07:23.961 04:50:58 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:23.961 04:50:58 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:07:23.961 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.961 --rc genhtml_branch_coverage=1 00:07:23.961 --rc genhtml_function_coverage=1 00:07:23.961 --rc genhtml_legend=1 00:07:23.961 --rc geninfo_all_blocks=1 00:07:23.961 --rc geninfo_unexecuted_blocks=1 00:07:23.961 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.961 ' 00:07:23.961 04:50:58 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:07:23.961 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.961 --rc genhtml_branch_coverage=1 00:07:23.961 --rc genhtml_function_coverage=1 00:07:23.961 --rc genhtml_legend=1 00:07:23.961 --rc geninfo_all_blocks=1 00:07:23.961 --rc geninfo_unexecuted_blocks=1 00:07:23.961 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.961 ' 00:07:23.961 04:50:58 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:07:23.961 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.961 --rc genhtml_branch_coverage=1 
00:07:23.961 --rc genhtml_function_coverage=1 00:07:23.961 --rc genhtml_legend=1 00:07:23.961 --rc geninfo_all_blocks=1 00:07:23.961 --rc geninfo_unexecuted_blocks=1 00:07:23.961 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.961 ' 00:07:23.961 04:50:58 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:07:23.961 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:23.961 --rc genhtml_branch_coverage=1 00:07:23.961 --rc genhtml_function_coverage=1 00:07:23.961 --rc genhtml_legend=1 00:07:23.961 --rc geninfo_all_blocks=1 00:07:23.961 --rc geninfo_unexecuted_blocks=1 00:07:23.961 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:07:23.961 ' 00:07:23.961 04:50:58 -- nvmf/run.sh@52 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:07:23.961 04:50:58 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:07:23.961 04:50:58 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:07:23.961 04:50:58 -- common/autotest_common.sh@34 -- # set -e 00:07:23.961 04:50:58 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:07:23.961 04:50:58 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:07:23.961 04:50:58 -- common/autotest_common.sh@38 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:07:23.961 04:50:58 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:07:23.961 04:50:58 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:07:23.961 04:50:58 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:07:23.961 04:50:58 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:07:23.961 04:50:58 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:07:23.961 04:50:58 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:07:23.962 04:50:58 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:07:23.962 04:50:58 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:07:23.962 04:50:58 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:07:23.962 04:50:58 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:07:23.962 04:50:58 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:07:23.962 04:50:58 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:07:23.962 04:50:58 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:07:23.962 04:50:58 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:07:23.962 04:50:58 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:07:23.962 04:50:58 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:07:23.962 04:50:58 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:07:23.962 04:50:58 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:07:23.962 04:50:58 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:07:23.962 04:50:58 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:07:23.962 04:50:58 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:07:23.962 04:50:58 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:07:23.962 04:50:58 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:07:23.962 04:50:58 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:07:23.962 04:50:58 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:07:23.962 04:50:58 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:07:23.962 
04:50:58 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:07:23.962 04:50:58 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:07:23.962 04:50:58 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:07:23.962 04:50:58 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:07:23.962 04:50:58 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:07:23.962 04:50:58 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:07:23.962 04:50:58 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:07:23.962 04:50:58 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:07:23.962 04:50:58 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:07:23.962 04:50:58 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:07:23.962 04:50:58 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:07:23.962 04:50:58 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:07:23.962 04:50:58 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:07:23.962 04:50:58 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:07:23.962 04:50:58 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:07:23.962 04:50:58 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:07:23.962 04:50:58 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:07:23.962 04:50:58 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:07:23.962 04:50:58 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:07:23.962 04:50:58 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:07:23.962 04:50:58 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:07:23.962 04:50:58 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:07:23.962 04:50:58 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:07:23.962 04:50:58 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:07:23.962 04:50:58 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:07:23.962 04:50:58 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:07:23.962 04:50:58 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:07:23.962 04:50:58 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:07:23.962 04:50:58 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:07:23.962 04:50:58 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:07:23.962 04:50:58 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:07:23.962 04:50:58 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:07:23.962 04:50:58 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:07:23.962 04:50:58 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:07:23.962 04:50:58 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:07:23.962 04:50:58 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:07:23.962 04:50:58 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:07:23.962 04:50:58 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:07:23.962 04:50:58 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:07:23.962 04:50:58 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:07:23.962 04:50:58 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:07:23.962 04:50:58 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:07:23.962 04:50:58 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:07:23.962 04:50:58 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:07:23.962 04:50:58 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:07:23.962 04:50:58 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 
00:07:23.962 04:50:58 -- common/build_config.sh@72 -- # CONFIG_TESTS=y
00:07:23.962 04:50:58 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n
00:07:23.962 04:50:58 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES=
00:07:23.962 04:50:58 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n
00:07:23.962 04:50:58 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y
00:07:23.962 04:50:58 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n
00:07:23.962 04:50:58 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX=
00:07:23.962 04:50:58 -- common/build_config.sh@79 -- # CONFIG_URING=n
00:07:23.962 04:50:58 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh
00:07:23.962 04:50:58 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh
00:07:23.962 04:50:58 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common
00:07:23.962 04:50:58 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common
00:07:23.962 04:50:58 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:07:23.962 04:50:58 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:07:23.962 04:50:58 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app
00:07:23.962 04:50:58 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:07:23.962 04:50:58 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz")
00:07:23.962 04:50:58 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt")
00:07:23.962 04:50:58 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt")
00:07:23.962 04:50:58 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost")
00:07:23.962 04:50:58 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd")
00:07:23.962 04:50:58 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt")
00:07:23.962 04:50:58 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]]
00:07:23.962 04:50:58 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H
00:07:23.962 #define SPDK_CONFIG_H
00:07:23.962 #define SPDK_CONFIG_APPS 1
00:07:23.962 #define SPDK_CONFIG_ARCH native
00:07:23.962 #undef SPDK_CONFIG_ASAN
00:07:23.962 #undef SPDK_CONFIG_AVAHI
00:07:23.962 #undef SPDK_CONFIG_CET
00:07:23.962 #define SPDK_CONFIG_COVERAGE 1
00:07:23.962 #define SPDK_CONFIG_CROSS_PREFIX
00:07:23.962 #undef SPDK_CONFIG_CRYPTO
00:07:23.962 #undef SPDK_CONFIG_CRYPTO_MLX5
00:07:23.962 #undef SPDK_CONFIG_CUSTOMOCF
00:07:23.962 #undef SPDK_CONFIG_DAOS
00:07:23.962 #define SPDK_CONFIG_DAOS_DIR
00:07:23.962 #define SPDK_CONFIG_DEBUG 1
00:07:23.962 #undef SPDK_CONFIG_DPDK_COMPRESSDEV
00:07:23.962 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build
00:07:23.962 #define SPDK_CONFIG_DPDK_INC_DIR
00:07:23.962 #define SPDK_CONFIG_DPDK_LIB_DIR
00:07:23.962 #undef SPDK_CONFIG_DPDK_PKG_CONFIG
00:07:23.962 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk
00:07:23.962 #define SPDK_CONFIG_EXAMPLES 1
00:07:23.962 #undef SPDK_CONFIG_FC
00:07:23.962 #define SPDK_CONFIG_FC_PATH
00:07:23.962 #define SPDK_CONFIG_FIO_PLUGIN 1
00:07:23.962 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio
00:07:23.962 #undef SPDK_CONFIG_FUSE
00:07:23.962 #define SPDK_CONFIG_FUZZER 1
00:07:23.962 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a
00:07:23.962 #undef SPDK_CONFIG_GOLANG
00:07:23.962 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1
00:07:23.962 #define SPDK_CONFIG_HAVE_EXECINFO_H 1
00:07:23.962 #undef SPDK_CONFIG_HAVE_LIBARCHIVE
00:07:23.962 #undef SPDK_CONFIG_HAVE_LIBBSD
00:07:23.962 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1
00:07:23.962 #define SPDK_CONFIG_IDXD 1
00:07:23.962 #define SPDK_CONFIG_IDXD_KERNEL 1
00:07:23.962 #undef SPDK_CONFIG_IPSEC_MB
00:07:23.962 #define SPDK_CONFIG_IPSEC_MB_DIR
00:07:23.962 #define SPDK_CONFIG_ISAL 1
00:07:23.962 #define SPDK_CONFIG_ISAL_CRYPTO 1
00:07:23.962 #define SPDK_CONFIG_ISCSI_INITIATOR 1
00:07:23.962 #define SPDK_CONFIG_LIBDIR
00:07:23.962 #undef SPDK_CONFIG_LTO
00:07:23.962 #define SPDK_CONFIG_MAX_LCORES
00:07:23.962 #define SPDK_CONFIG_NVME_CUSE 1
00:07:23.962 #undef SPDK_CONFIG_OCF
00:07:23.962 #define SPDK_CONFIG_OCF_PATH
00:07:23.962 #define SPDK_CONFIG_OPENSSL_PATH
00:07:23.962 #undef SPDK_CONFIG_PGO_CAPTURE
00:07:23.962 #undef SPDK_CONFIG_PGO_USE
00:07:23.962 #define SPDK_CONFIG_PREFIX /usr/local
00:07:23.962 #undef SPDK_CONFIG_RAID5F
00:07:23.962 #undef SPDK_CONFIG_RBD
00:07:23.962 #define SPDK_CONFIG_RDMA 1
00:07:23.962 #define SPDK_CONFIG_RDMA_PROV verbs
00:07:23.962 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1
00:07:23.962 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1
00:07:23.962 #define SPDK_CONFIG_RDMA_SET_TOS 1
00:07:23.962 #undef SPDK_CONFIG_SHARED
00:07:23.962 #undef SPDK_CONFIG_SMA
00:07:23.962 #define SPDK_CONFIG_TESTS 1
00:07:23.962 #undef SPDK_CONFIG_TSAN
00:07:23.962 #define SPDK_CONFIG_UBLK 1
00:07:23.962 #define SPDK_CONFIG_UBSAN 1
00:07:23.962 #undef SPDK_CONFIG_UNIT_TESTS
00:07:23.962 #undef SPDK_CONFIG_URING
00:07:23.962 #define SPDK_CONFIG_URING_PATH
00:07:23.962 #undef SPDK_CONFIG_URING_ZNS
00:07:23.962 #undef SPDK_CONFIG_USDT
00:07:23.962 #undef SPDK_CONFIG_VBDEV_COMPRESS
00:07:23.962 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5
00:07:23.962 #define SPDK_CONFIG_VFIO_USER 1
00:07:23.962 #define SPDK_CONFIG_VFIO_USER_DIR
00:07:23.962 #define SPDK_CONFIG_VHOST 1
00:07:23.962 #define SPDK_CONFIG_VIRTIO 1
00:07:23.962 #undef SPDK_CONFIG_VTUNE
00:07:23.962 #define SPDK_CONFIG_VTUNE_DIR
00:07:23.962 #define SPDK_CONFIG_WERROR 1
00:07:23.962 #define SPDK_CONFIG_WPDK_DIR
00:07:23.962 #undef SPDK_CONFIG_XNVME
00:07:23.962 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]]
00:07:23.963 04:50:58 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS ))
00:07:23.963 04:50:58 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:07:23.963 04:50:58 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]]
00:07:23.963 04:50:58 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:07:23.963 04:50:58 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:07:23.963 04:50:58 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:23.963 04:50:58 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:23.963 04:50:58 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:23.963 04:50:59 -- paths/export.sh@5 -- # export PATH
00:07:23.963 04:50:59 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:23.963 04:50:58 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common
00:07:23.963 04:50:58 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common
00:07:23.963 04:50:58 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm
00:07:23.963 04:50:58 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm
00:07:23.963 04:50:58 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../
00:07:23.963 04:50:58 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:07:23.963 04:50:58 -- pm/common@16 -- # TEST_TAG=N/A
00:07:23.963 04:50:58 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name
00:07:23.963 04:50:58 -- common/autotest_common.sh@52 -- # : 1
00:07:23.963 04:50:58 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY
00:07:23.963 04:50:58 -- common/autotest_common.sh@56 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS
00:07:23.963 04:50:58 -- common/autotest_common.sh@58 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND
00:07:23.963 04:50:58 -- common/autotest_common.sh@60 -- # : 1
00:07:23.963 04:50:58 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST
00:07:23.963 04:50:58 -- common/autotest_common.sh@62 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST
00:07:23.963 04:50:58 -- common/autotest_common.sh@64 -- # :
00:07:23.963 04:50:58 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD
00:07:23.963 04:50:58 -- common/autotest_common.sh@66 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD
00:07:23.963 04:50:58 -- common/autotest_common.sh@68 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL
00:07:23.963 04:50:58 -- common/autotest_common.sh@70 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI
00:07:23.963 04:50:58 -- common/autotest_common.sh@72 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR
00:07:23.963 04:50:58 -- common/autotest_common.sh@74 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME
00:07:23.963 04:50:58 -- common/autotest_common.sh@76 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR
00:07:23.963 04:50:58 -- common/autotest_common.sh@78 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP
00:07:23.963 04:50:58 -- common/autotest_common.sh@80 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI
00:07:23.963 04:50:58 -- common/autotest_common.sh@82 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE
00:07:23.963 04:50:58 -- common/autotest_common.sh@84 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP
00:07:23.963 04:50:58 -- common/autotest_common.sh@86 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF
00:07:23.963 04:50:58 -- common/autotest_common.sh@88 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER
00:07:23.963 04:50:58 -- common/autotest_common.sh@90 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU
00:07:23.963 04:50:58 -- common/autotest_common.sh@92 -- # : 1
00:07:23.963 04:50:58 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER
00:07:23.963 04:50:58 -- common/autotest_common.sh@94 -- # : 1
00:07:23.963 04:50:58 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT
00:07:23.963 04:50:58 -- common/autotest_common.sh@96 -- # : rdma
00:07:23.963 04:50:58 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT
00:07:23.963 04:50:58 -- common/autotest_common.sh@98 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD
00:07:23.963 04:50:58 -- common/autotest_common.sh@100 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST
00:07:23.963 04:50:58 -- common/autotest_common.sh@102 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV
00:07:23.963 04:50:58 -- common/autotest_common.sh@104 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT
00:07:23.963 04:50:58 -- common/autotest_common.sh@106 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS
00:07:23.963 04:50:58 -- common/autotest_common.sh@108 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT
00:07:23.963 04:50:58 -- common/autotest_common.sh@110 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL
00:07:23.963 04:50:58 -- common/autotest_common.sh@112 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS
00:07:23.963 04:50:58 -- common/autotest_common.sh@114 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN
00:07:23.963 04:50:58 -- common/autotest_common.sh@116 -- # : 1
00:07:23.963 04:50:58 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN
00:07:23.963 04:50:58 -- common/autotest_common.sh@118 -- # :
00:07:23.963 04:50:58 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK
00:07:23.963 04:50:58 -- common/autotest_common.sh@120 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT
00:07:23.963 04:50:58 -- common/autotest_common.sh@122 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO
00:07:23.963 04:50:58 -- common/autotest_common.sh@124 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL
00:07:23.963 04:50:58 -- common/autotest_common.sh@126 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF
00:07:23.963 04:50:58 -- common/autotest_common.sh@128 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD
00:07:23.963 04:50:58 -- common/autotest_common.sh@130 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL
00:07:23.963 04:50:58 -- common/autotest_common.sh@132 -- # :
00:07:23.963 04:50:58 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK
00:07:23.963 04:50:58 -- common/autotest_common.sh@134 -- # : true
00:07:23.963 04:50:58 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X
00:07:23.963 04:50:58 -- common/autotest_common.sh@136 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5
00:07:23.963 04:50:58 -- common/autotest_common.sh@138 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING
00:07:23.963 04:50:58 -- common/autotest_common.sh@140 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT
00:07:23.963 04:50:58 -- common/autotest_common.sh@142 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO
00:07:23.963 04:50:58 -- common/autotest_common.sh@144 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER
00:07:23.963 04:50:58 -- common/autotest_common.sh@146 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD
00:07:23.963 04:50:58 -- common/autotest_common.sh@148 -- # :
00:07:23.963 04:50:58 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS
00:07:23.963 04:50:58 -- common/autotest_common.sh@150 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA
00:07:23.963 04:50:58 -- common/autotest_common.sh@152 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS
00:07:23.963 04:50:58 -- common/autotest_common.sh@154 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME
00:07:23.963 04:50:58 -- common/autotest_common.sh@156 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA
00:07:23.963 04:50:58 -- common/autotest_common.sh@158 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA
00:07:23.963 04:50:58 -- common/autotest_common.sh@160 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT
00:07:23.963 04:50:58 -- common/autotest_common.sh@163 -- # :
00:07:23.963 04:50:58 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET
00:07:23.963 04:50:58 -- common/autotest_common.sh@165 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS
00:07:23.963 04:50:58 -- common/autotest_common.sh@167 -- # : 0
00:07:23.963 04:50:58 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT
00:07:23.963 04:50:58 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib
00:07:23.964 04:50:58 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib
00:07:23.964 04:50:58 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib
00:07:23.964 04:50:58 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib
00:07:23.964 04:50:58 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:07:23.964 04:50:58 -- common/autotest_common.sh@173 -- # VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:07:23.964 04:50:58 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:07:23.964 04:50:58 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:07:23.964 04:50:58 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes
00:07:23.964 04:50:58 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes
00:07:23.964 04:50:58 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python
00:07:23.964 04:50:58 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python
00:07:23.964 04:50:58 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1
00:07:23.964 04:50:58 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1
00:07:23.964 04:50:58 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:07:23.964 04:50:58 -- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:07:23.964 04:50:58 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:07:23.964 04:50:58 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:07:23.964 04:50:58 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file
00:07:23.964 04:50:58 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file
00:07:23.964 04:50:58 -- common/autotest_common.sh@196 -- # cat
00:07:23.964 04:50:58 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so
00:07:23.964 04:50:58 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:07:23.964 04:50:58 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:07:23.964 04:50:58 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:07:23.964 04:50:58 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:07:23.964 04:50:58 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']'
00:07:23.964 04:50:58 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR
00:07:23.964 04:50:58 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:07:23.964 04:50:58 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin
00:07:23.964 04:50:58 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:07:23.964 04:50:58 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples
00:07:23.964 04:50:58 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:07:23.964 04:50:58 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:07:23.964 04:50:58 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:07:23.964 04:50:58 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:07:23.964 04:50:58 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:07:23.964 04:50:58 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:07:23.964 04:50:58 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:07:23.964 04:50:58 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes
00:07:23.964 04:50:58 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0
00:07:23.964 04:50:58 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1
00:07:23.964 04:50:58 -- common/autotest_common.sh@249 -- # _LCOV=
00:07:23.964 04:50:58 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]]
00:07:23.964 04:50:58 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]]
00:07:23.964 04:50:58 -- common/autotest_common.sh@250 -- # _LCOV=1
00:07:23.964 04:50:58 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh'
00:07:23.964 04:50:58 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]=
00:07:23.964 04:50:58 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh'
00:07:23.964 04:50:58 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']'
00:07:23.964 04:50:58 -- common/autotest_common.sh@259 -- # export valgrind=
00:07:23.964 04:50:58 -- common/autotest_common.sh@259 -- # valgrind=
00:07:23.964 04:50:58 -- common/autotest_common.sh@265 -- # uname -s
00:07:23.964 04:50:58 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']'
00:07:23.964 04:50:58 -- common/autotest_common.sh@266 -- # HUGEMEM=4096
00:07:23.964 04:50:58 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes
00:07:23.964 04:50:58 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes
00:07:23.964 04:50:58 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]]
00:07:23.964 04:50:58 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]]
00:07:23.964 04:50:59 -- common/autotest_common.sh@275 -- # MAKE=make
00:07:23.964 04:50:59 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112
00:07:23.964 04:50:59 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096
00:07:23.964 04:50:59 -- common/autotest_common.sh@292 -- # HUGEMEM=4096
00:07:23.964 04:50:59 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']'
00:07:23.964 04:50:59 -- common/autotest_common.sh@299 -- # NO_HUGE=()
00:07:23.964 04:50:59 -- common/autotest_common.sh@300 -- # TEST_MODE=
00:07:23.964 04:50:59 -- common/autotest_common.sh@319 -- # [[ -z 3677955 ]]
00:07:23.964 04:50:59 -- common/autotest_common.sh@319 -- # kill -0 3677955
00:07:23.964 04:50:59 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648
00:07:23.964 04:50:59 -- common/autotest_common.sh@329 -- # [[ -v testdir ]]
00:07:23.964 04:50:59 -- common/autotest_common.sh@331 -- # local requested_size=2147483648
00:07:23.964 04:50:59 -- common/autotest_common.sh@332 -- # local mount target_dir
00:07:23.964 04:50:59 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses
00:07:23.964 04:50:59 -- common/autotest_common.sh@335 -- # local source fs size avail mount use
00:07:23.964 04:50:59 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates
00:07:23.964 04:50:59 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX
00:07:23.964 04:50:59 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.DPZVnz
00:07:23.964 04:50:59 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")
00:07:23.964 04:50:59 -- common/autotest_common.sh@346 -- # [[ -n '' ]]
00:07:23.964 04:50:59 -- common/autotest_common.sh@351 -- # [[ -n '' ]]
00:07:23.964 04:50:59 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf /tmp/spdk.DPZVnz/tests/nvmf /tmp/spdk.DPZVnz
00:07:23.964 04:50:59 -- common/autotest_common.sh@359 -- # requested_size=2214592512
00:07:23.964 04:50:59 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount
00:07:23.964 04:50:59 -- common/autotest_common.sh@328 -- # df -T
00:07:23.964 04:50:59 -- common/autotest_common.sh@328 -- # grep -v Filesystem
00:07:23.964 04:50:59 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs
00:07:23.964 04:50:59 -- common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs
00:07:23.964 04:50:59 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864
00:07:23.964 04:50:59 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864
00:07:23.964 04:50:59 -- common/autotest_common.sh@364 -- # uses["$mount"]=0
00:07:23.964 04:50:59 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount
00:07:23.964 04:50:59 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0
00:07:23.964 04:50:59 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2
00:07:23.964 04:50:59 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096
00:07:23.964 04:50:59 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824
00:07:23.964 04:50:59 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728
00:07:23.964 04:50:59 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount
00:07:23.964 04:50:59 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root
00:07:23.964 04:50:59 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay
00:07:23.964 04:50:59 -- common/autotest_common.sh@363 -- # avails["$mount"]=54021926912
00:07:23.964 04:50:59 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730578432
00:07:23.964 04:50:59 -- common/autotest_common.sh@364 -- # uses["$mount"]=7708651520
00:07:23.964 04:50:59 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount
00:07:23.964 04:50:59 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs
00:07:23.964 04:50:59 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs
00:07:23.964 04:50:59 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862696448
00:07:23.964 04:50:59 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216
00:07:23.965 04:50:59 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768
00:07:23.965 04:50:59 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount
00:07:23.965 04:50:59 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs
00:07:23.965 04:50:59 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs
00:07:23.965 04:50:59 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504
00:07:23.965 04:50:59 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144
00:07:23.965 04:50:59 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640
00:07:23.965 04:50:59 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount
00:07:23.965 04:50:59 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs
00:07:23.965 04:50:59 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs
00:07:23.965 04:50:59 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864465920
00:07:23.965 04:50:59 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216
00:07:23.965 04:50:59 -- common/autotest_common.sh@364 -- # uses["$mount"]=823296
00:07:23.965 04:50:59 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount
00:07:23.965 04:50:59 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs
00:07:23.965 04:50:59 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs
00:07:23.965 04:50:59 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736
00:07:23.965 04:50:59 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024
00:07:23.965 04:50:59 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288
00:07:23.965 04:50:59 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount
00:07:23.965 04:50:59 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n'
00:07:23.965 * Looking for test storage...
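(The mount scan just traced and the candidate check that follows are the two halves of set_test_storage: df -T output is read into parallel associative arrays, then the first storage candidate with enough available space is exported. A condensed, illustrative bash sketch of that pattern, not the verbatim autotest_common.sh; the "* 1024" scaling assumes df's default 1K-block units, and storage_candidates is stubbed to match the @344 record above:)

    # Sketch: catalog mounts, then pick a test directory with enough free space.
    declare -A mounts fss sizes avails uses
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source            # backing device (spdk_root, tmpfs, ...)
        fss["$mount"]=$fs                   # filesystem type (overlay, tmpfs, ext2, ...)
        sizes["$mount"]=$((size * 1024))    # store bytes, assuming 1K-block df output
        uses["$mount"]=$((use * 1024))
        avails["$mount"]=$((avail * 1024))
    done < <(df -T | grep -v Filesystem)

    requested_size=2214592512               # 2 GiB plus slack, as in the trace
    storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")
    for target_dir in "${storage_candidates[@]}"; do
        # Resolve which mount backs this candidate directory (df column 6).
        mount=$(df "$target_dir" | awk '$1 !~ /Filesystem/{print $6}')
        if ((avails["$mount"] >= requested_size)); then
            export SPDK_TEST_STORAGE=$target_dir
            break
        fi
    done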
00:07:23.965 04:50:59 -- common/autotest_common.sh@369 -- # local target_space new_size
00:07:23.965 04:50:59 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}"
00:07:23.965 04:50:59 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}'
00:07:23.965 04:50:59 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:23.965 04:50:59 -- common/autotest_common.sh@373 -- # mount=/
00:07:23.965 04:50:59 -- common/autotest_common.sh@375 -- # target_space=54021926912
00:07:23.965 04:50:59 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size ))
00:07:23.965 04:50:59 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size ))
00:07:23.965 04:50:59 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]]
00:07:23.965 04:50:59 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]]
00:07:23.965 04:50:59 -- common/autotest_common.sh@381 -- # [[ / == / ]]
00:07:23.965 04:50:59 -- common/autotest_common.sh@382 -- # new_size=9923244032
00:07:23.965 04:50:59 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 ))
00:07:23.965 04:50:59 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:23.965 04:50:59 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:23.965 04:50:59 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:23.965 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf
00:07:23.965 04:50:59 -- common/autotest_common.sh@390 -- # return 0
00:07:23.965 04:50:59 -- common/autotest_common.sh@1677 -- # set -o errtrace
00:07:23.965 04:50:59 -- common/autotest_common.sh@1678 -- # shopt -s extdebug
00:07:23.965 04:50:59 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR
00:07:23.965 04:50:59 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
00:07:23.965 04:50:59 -- common/autotest_common.sh@1682 -- # true
00:07:23.965 04:50:59 -- common/autotest_common.sh@1684 -- # xtrace_fd
00:07:23.965 04:50:59 -- common/autotest_common.sh@25 -- # [[ -n 14 ]]
00:07:23.965 04:50:59 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]]
00:07:23.965 04:50:59 -- common/autotest_common.sh@27 -- # exec
00:07:23.965 04:50:59 -- common/autotest_common.sh@29 -- # exec
00:07:23.965 04:50:59 -- common/autotest_common.sh@31 -- # xtrace_restore
00:07:23.965 04:50:59 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]'
00:07:23.965 04:50:59 -- common/autotest_common.sh@17 -- # (( 0 == 0 ))
00:07:23.965 04:50:59 -- common/autotest_common.sh@18 -- # set -x
00:07:23.965 04:50:59 -- common/autotest_common.sh@1689 -- # [[ y == y ]]
00:07:23.965 04:50:59 -- common/autotest_common.sh@1690 -- # lcov --version
00:07:23.965 04:50:59 -- common/autotest_common.sh@1690 -- # awk '{print $NF}'
00:07:24.224 04:50:59 -- common/autotest_common.sh@1690 -- # lt 1.15 2
00:07:24.224 04:50:59 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2
00:07:24.224 04:50:59 -- scripts/common.sh@332 -- # local ver1 ver1_l
00:07:24.224 04:50:59 -- scripts/common.sh@333 -- # local ver2 ver2_l
00:07:24.224 04:50:59 -- scripts/common.sh@335 -- # IFS=.-:
00:07:24.224 04:50:59 -- scripts/common.sh@335 -- # read -ra ver1
00:07:24.224 04:50:59 -- scripts/common.sh@336 -- # IFS=.-:
00:07:24.224 04:50:59 -- scripts/common.sh@336 -- # read -ra ver2
00:07:24.224 04:50:59 -- scripts/common.sh@337 -- # local 'op=<'
00:07:24.224 04:50:59 -- scripts/common.sh@339 -- # ver1_l=2
00:07:24.224 04:50:59 -- scripts/common.sh@340 -- # ver2_l=1
00:07:24.224 04:50:59 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v
00:07:24.224 04:50:59 -- scripts/common.sh@343 -- # case "$op" in
00:07:24.224 04:50:59 -- scripts/common.sh@344 -- # : 1
00:07:24.224 04:50:59 -- scripts/common.sh@363 -- # (( v = 0 ))
00:07:24.224 04:50:59 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:07:24.224 04:50:59 -- scripts/common.sh@364 -- # decimal 1
00:07:24.224 04:50:59 -- scripts/common.sh@352 -- # local d=1
00:07:24.224 04:50:59 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]]
00:07:24.224 04:50:59 -- scripts/common.sh@354 -- # echo 1
00:07:24.224 04:50:59 -- scripts/common.sh@364 -- # ver1[v]=1
00:07:24.224 04:50:59 -- scripts/common.sh@365 -- # decimal 2
00:07:24.224 04:50:59 -- scripts/common.sh@352 -- # local d=2
00:07:24.224 04:50:59 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]]
00:07:24.224 04:50:59 -- scripts/common.sh@354 -- # echo 2
00:07:24.224 04:50:59 -- scripts/common.sh@365 -- # ver2[v]=2
00:07:24.224 04:50:59 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] ))
00:07:24.224 04:50:59 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] ))
00:07:24.224 04:50:59 -- scripts/common.sh@367 -- # return 0
00:07:24.224 04:50:59 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:07:24.224 04:50:59 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS=
00:07:24.224 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:24.224 --rc genhtml_branch_coverage=1
00:07:24.224 --rc genhtml_function_coverage=1
00:07:24.224 --rc genhtml_legend=1
00:07:24.224 --rc geninfo_all_blocks=1
00:07:24.224 --rc geninfo_unexecuted_blocks=1
00:07:24.224 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:24.224 '
00:07:24.224 04:50:59 -- common/autotest_common.sh@1703 -- # LCOV_OPTS='
00:07:24.224 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:24.224 --rc genhtml_branch_coverage=1
00:07:24.224 --rc genhtml_function_coverage=1
00:07:24.224 --rc genhtml_legend=1
00:07:24.224 --rc geninfo_all_blocks=1
00:07:24.224 --rc geninfo_unexecuted_blocks=1
00:07:24.224 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:24.224 '
00:07:24.224 04:50:59 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov
00:07:24.224 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:24.224 --rc genhtml_branch_coverage=1
00:07:24.224 --rc genhtml_function_coverage=1
00:07:24.224 --rc genhtml_legend=1
00:07:24.224 --rc geninfo_all_blocks=1
00:07:24.224 --rc geninfo_unexecuted_blocks=1
00:07:24.224 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:24.224 '
00:07:24.224 04:50:59 -- common/autotest_common.sh@1704 -- # LCOV='lcov
00:07:24.224 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:07:24.224 --rc genhtml_branch_coverage=1
00:07:24.224 --rc genhtml_function_coverage=1
00:07:24.224 --rc genhtml_legend=1
00:07:24.224 --rc geninfo_all_blocks=1
00:07:24.224 --rc geninfo_unexecuted_blocks=1
00:07:24.224 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:07:24.224 '
00:07:24.224 04:50:59 -- nvmf/run.sh@53 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/../common.sh
00:07:24.224 04:50:59 -- ../common.sh@8 -- # pids=()
00:07:24.224 04:50:59 -- nvmf/run.sh@55 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c
00:07:24.224 04:50:59 -- nvmf/run.sh@56 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c
00:07:24.224 04:50:59 -- nvmf/run.sh@56 -- # fuzz_num=25
00:07:24.224 04:50:59 -- nvmf/run.sh@57 -- # (( fuzz_num != 0 ))
00:07:24.224 04:50:59 -- nvmf/run.sh@59 -- # trap 'cleanup /tmp/llvm_fuzz*; exit 1' SIGINT SIGTERM EXIT
00:07:24.224 04:50:59 -- nvmf/run.sh@61 -- # mem_size=512
00:07:24.224 04:50:59 -- nvmf/run.sh@62 -- # [[ 1 -eq 1 ]]
00:07:24.224 04:50:59 -- nvmf/run.sh@63 -- # start_llvm_fuzz_short 25 1
00:07:24.224 04:50:59 -- ../common.sh@69 -- # local fuzz_num=25
00:07:24.224 04:50:59 -- ../common.sh@70 -- # local time=1
00:07:24.224 04:50:59 -- ../common.sh@72 -- # (( i = 0 ))
00:07:24.224 04:50:59 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:24.224 04:50:59 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1
00:07:24.224 04:50:59 -- nvmf/run.sh@23 -- # local fuzzer_type=0
00:07:24.224 04:50:59 -- nvmf/run.sh@24 -- # local timen=1
00:07:24.224 04:50:59 -- nvmf/run.sh@25 -- # local core=0x1
00:07:24.224 04:50:59 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:07:24.224 04:50:59 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_0.conf
00:07:24.224 04:50:59 -- nvmf/run.sh@29 -- # printf %02d 0
00:07:24.224 04:50:59 -- nvmf/run.sh@29 -- # port=4400
00:07:24.224 04:50:59 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:07:24.224 04:50:59 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400'
00:07:24.224 04:50:59 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4400"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:24.224 04:50:59 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4400' -c /tmp/fuzz_json_0.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0 -Z 0 -r /var/tmp/spdk0.sock
00:07:24.224 [2024-11-08 04:50:59.203474] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:24.224 [2024-11-08 04:50:59.203562] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3678069 ]
00:07:24.224 EAL: No free 2048 kB hugepages reported on node 1
00:07:24.483 [2024-11-08 04:50:59.466424] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:24.483 [2024-11-08 04:50:59.551260] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:24.483 [2024-11-08 04:50:59.551388] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:24.740 [2024-11-08 04:50:59.609613] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:24.740 [2024-11-08 04:50:59.625962] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4400 ***
00:07:24.740 INFO: Running with entropic power schedule (0xFF, 100).
00:07:24.740 INFO: Seed: 214750599
00:07:24.740 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:24.740 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:24.740 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_0
00:07:24.740 INFO: A corpus is not provided, starting from an empty corpus
00:07:24.740 #2 INITED exec/s: 0 rss: 60Mb
00:07:24.740 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:24.740 This may also happen if the target rejected all inputs we tried so far
00:07:24.740 [2024-11-08 04:50:59.674923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:24.740 [2024-11-08 04:50:59.674952] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:24.998 NEW_FUNC[1/669]: 0x43a858 in fuzz_admin_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:47
00:07:24.998 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:24.998 #3 NEW cov: 11537 ft: 11541 corp: 2/110b lim: 320 exec/s: 0 rss: 68Mb L: 109/109 MS: 1 InsertRepeatedBytes-
00:07:24.998 [2024-11-08 04:50:59.995703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:24.998 [2024-11-08 04:50:59.995735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:24.998 NEW_FUNC[1/1]: 0x1c5f608 in spdk_thread_get_last_tsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1312
00:07:24.998 #4 NEW cov: 11654 ft: 12058 corp: 3/220b lim: 320 exec/s: 0 rss: 69Mb L: 110/110 MS: 1 InsertByte-
00:07:24.998 [2024-11-08 04:51:00.045814] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:24.998 [2024-11-08 04:51:00.045849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:24.998 #5 NEW cov: 11660 ft: 12404 corp: 4/304b lim: 320 exec/s: 0 rss: 69Mb L: 84/110 MS: 1 EraseBytes-
00:07:24.998 [2024-11-08 04:51:00.085896] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1ad2e48f5c9682ff
00:07:24.998 [2024-11-08 04:51:00.085927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:24.998 #11 NEW cov: 11745 ft: 12625 corp: 5/422b lim: 320 exec/s: 0 rss: 69Mb L: 118/118 MS: 1 CMP- DE: "\377\202\226\\\217\344\322\032"-
00:07:25.256 [2024-11-08 04:51:00.126044] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.256 [2024-11-08 04:51:00.126074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.256 #12 NEW cov: 11745 ft: 12698 corp: 6/532b lim: 320 exec/s: 0 rss: 69Mb L: 110/118 MS: 1 ChangeBit-
00:07:25.256 [2024-11-08 04:51:00.166151] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1ad2e48f5c9682ff
00:07:25.256 [2024-11-08 04:51:00.166178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.256 #13 NEW cov: 11745 ft: 12861 corp: 7/650b lim: 320 exec/s: 0 rss: 69Mb L: 118/118 MS: 1 ShuffleBytes-
00:07:25.256 [2024-11-08 04:51:00.206419] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0xc9c9c9c9c9c9c9c9
00:07:25.256 [2024-11-08 04:51:00.206445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.256 [2024-11-08 04:51:00.206505] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c9) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:25.256 [2024-11-08 04:51:00.206520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:25.256 NEW_FUNC[1/1]: 0x16c4058 in nvme_get_sgl_unkeyed /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/nvme_qpair.c:143
00:07:25.256 #14 NEW cov: 11780 ft: 13517 corp: 8/809b lim: 320 exec/s: 0 rss: 69Mb L: 159/159 MS: 1 InsertRepeatedBytes-
00:07:25.256 [2024-11-08 04:51:00.246376] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.256 [2024-11-08 04:51:00.246403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.256 #15 NEW cov: 11780 ft: 13536 corp: 9/895b lim: 320 exec/s: 0 rss: 69Mb L: 86/159 MS: 1 CMP- DE: "\001\014"-
00:07:25.256 [2024-11-08 04:51:00.286498] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.256 [2024-11-08 04:51:00.286529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.256 #16 NEW cov: 11780 ft: 13612 corp: 10/1005b lim: 320 exec/s: 0 rss: 69Mb L: 110/159 MS: 1 ChangeBinInt-
00:07:25.256 [2024-11-08 04:51:00.326586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000
00:07:25.256 [2024-11-08 04:51:00.326612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.256 #20 NEW cov: 11782 ft: 13732 corp: 11/1075b lim: 320 exec/s: 0 rss: 69Mb L: 70/159 MS: 4 InsertByte-EraseBytes-ChangeByte-InsertRepeatedBytes-
00:07:25.514 [2024-11-08 04:51:00.366703] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:dfdfdfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.514 [2024-11-08 04:51:00.366733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.514 #21 NEW cov: 11782 ft: 13795 corp: 12/1194b lim: 320 exec/s: 0 rss: 69Mb L: 119/159 MS: 1 InsertRepeatedBytes-
00:07:25.514 [2024-11-08 04:51:00.406947] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.514 [2024-11-08 04:51:00.406973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.514 [2024-11-08 04:51:00.407033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (df) qid:0 cid:5 nsid:0 cdw10:002a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.514 [2024-11-08 04:51:00.407047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:25.514 NEW_FUNC[1/1]: 0x12c5228 in nvmf_tcp_req_set_cpl /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/tcp.c:2016
00:07:25.514 #22 NEW cov: 11813 ft: 13855 corp: 13/1357b lim: 320 exec/s: 0 rss: 70Mb L: 163/163 MS: 1 CrossOver-
00:07:25.514 [2024-11-08 04:51:00.457007] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.515 [2024-11-08 04:51:00.457034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.515 #23 NEW cov: 11813 ft: 13893 corp: 14/1433b lim: 320 exec/s: 0 rss: 70Mb L: 76/163 MS: 1 EraseBytes-
00:07:25.515 [2024-11-08 04:51:00.497106] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x1ad2e48f5c9682ff
00:07:25.515 [2024-11-08 04:51:00.497132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.515 #24 NEW cov: 11813 ft: 13911 corp: 15/1555b lim: 320 exec/s: 0 rss: 70Mb L: 122/163 MS: 1 CMP- DE: "\001\000\000\000"-
00:07:25.515 [2024-11-08 04:51:00.537425] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0xc9c9c9c9c9c9c9c9
00:07:25.515 [2024-11-08 04:51:00.537451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.515 [2024-11-08 04:51:00.537510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c9) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:25.515 [2024-11-08 04:51:00.537529] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:25.515 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:07:25.515 #25 NEW cov: 11836 ft: 13924 corp: 16/1714b lim: 320 exec/s: 0 rss: 70Mb L: 159/163 MS: 1 ChangeBinInt-
00:07:25.515 [2024-11-08 04:51:00.587623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.515 [2024-11-08 04:51:00.587650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.515 [2024-11-08 04:51:00.587703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000
00:07:25.515 [2024-11-08 04:51:00.587716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:25.515 [2024-11-08 04:51:00.587769] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000
00:07:25.515 [2024-11-08 04:51:00.587782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:25.515 #26 NEW cov: 11836 ft: 14092 corp: 17/1918b lim: 320 exec/s: 0 rss: 70Mb L: 204/204 MS: 1 CrossOver-
00:07:25.773 [2024-11-08 04:51:00.627489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000
00:07:25.773 [2024-11-08 04:51:00.627519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.773 #27 NEW cov: 11836 ft: 14123 corp: 18/1996b lim: 320 exec/s: 0 rss: 70Mb L: 78/204 MS: 1 CMP- DE: "\011\016*JW\226\203\000"-
00:07:25.773 [2024-11-08 04:51:00.667635] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:dfdfdfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.773 [2024-11-08 04:51:00.667661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.773 #28 NEW cov: 11836 ft: 14140 corp: 19/2115b lim: 320 exec/s: 28 rss: 70Mb L: 119/204 MS: 1 ChangeBit-
00:07:25.773 [2024-11-08 04:51:00.707970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.773 [2024-11-08 04:51:00.707997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.773 [2024-11-08 04:51:00.708049] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf
00:07:25.773 [2024-11-08 04:51:00.708063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:25.773 [2024-11-08 04:51:00.708121] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (df) qid:0 cid:6 nsid:dfdfdf cdw10:00000000 cdw11:00002a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x10000000000
00:07:25.773 [2024-11-08 04:51:00.708136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:25.773 #29 NEW cov: 11836 ft: 14166 corp: 20/2331b lim: 320 exec/s: 29 rss: 70Mb L: 216/216 MS: 1 CrossOver-
00:07:25.773 [2024-11-08 04:51:00.747835] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0xc9c9c9c9c9c9c9c9
00:07:25.773 [2024-11-08 04:51:00.747861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.773 #30 NEW cov: 11836 ft: 14215 corp: 21/2419b lim: 320 exec/s: 30 rss: 70Mb L: 88/216 MS: 1 EraseBytes-
00:07:25.773 [2024-11-08 04:51:00.788311] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.773 [2024-11-08 04:51:00.788337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.773 [2024-11-08 04:51:00.788389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:dfdfdfdf cdw11:dfdfdfdf
00:07:25.773 [2024-11-08 04:51:00.788403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:25.773 [2024-11-08 04:51:00.788460] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (df) qid:0 cid:6 nsid:dfdfdf cdw10:00000000 cdw11:00002a00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x10000000000
00:07:25.773 [2024-11-08 04:51:00.788475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:25.773 [2024-11-08 04:51:00.788531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000
00:07:25.773 [2024-11-08 04:51:00.788545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:25.774 #31 NEW cov: 11836 ft: 14364 corp: 22/2695b lim: 320 exec/s: 31 rss: 70Mb L: 276/276 MS: 1 InsertRepeatedBytes-
00:07:25.774 [2024-11-08 04:51:00.828251] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0xc9c9c9c9c9c9c9c9
00:07:25.774 [2024-11-08 04:51:00.828278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:25.774 [2024-11-08 04:51:00.828337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c9) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:25.774 [2024-11-08 04:51:00.828354] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:25.774 #32 NEW cov: 11836 ft: 14374 corp: 23/2854b lim: 320 exec/s: 32 rss: 70Mb L: 159/276 MS: 1 ChangeBinInt-
00:07:25.774 [2024-11-08 04:51:00.868203] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:25.774 [2024-11-08 04:51:00.868228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.032 #33 NEW cov: 11836 ft: 14394 corp: 24/2963b lim: 320 exec/s: 33 rss: 70Mb L: 109/276 MS: 1 ShuffleBytes-
00:07:26.032 [2024-11-08 04:51:00.908301] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:26.032 [2024-11-08 04:51:00.908327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.032 #34 NEW cov: 11836 ft: 14399 corp: 25/3073b lim: 320 exec/s: 34 rss: 70Mb L: 110/276 MS: 1 InsertByte-
00:07:26.032 [2024-11-08 04:51:00.938471] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77777777
00:07:26.032 [2024-11-08 04:51:00.938497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.032 [2024-11-08 04:51:00.938561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (77) qid:0 cid:5 nsid:77777777 cdw10:77777777 cdw11:77777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7777777777777777
00:07:26.032 [2024-11-08 04:51:00.938575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:26.032 #39 NEW cov: 11836 ft: 14418 corp: 26/3220b lim: 320 exec/s: 39 rss: 70Mb L: 147/276 MS: 5 EraseBytes-InsertByte-ShuffleBytes-EraseBytes-InsertRepeatedBytes-
00:07:26.032 [2024-11-08 04:51:00.978484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xe48f5c9682ff0000
00:07:26.033 [2024-11-08 04:51:00.978510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.033 #40 NEW cov: 11836 ft: 14438 corp: 27/3330b lim: 320 exec/s: 40 rss: 70Mb L: 110/276 MS: 1 PersAutoDict- DE: "\377\202\226\\\217\344\322\032"-
00:07:26.033 [2024-11-08 04:51:01.018761] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0xc9c9c9c9c9c9c9c9
00:07:26.033 [2024-11-08 04:51:01.018788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.033 [2024-11-08 04:51:01.018849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c9) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:26.033 [2024-11-08 04:51:01.018862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:26.033 #41 NEW cov: 11836 ft: 14445 corp: 28/3490b lim: 320 exec/s: 41 rss: 70Mb L: 160/276 MS: 1 InsertByte-
00:07:26.033 [2024-11-08 04:51:01.058716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:26.033 [2024-11-08 04:51:01.058742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.033 #42 NEW cov: 11836 ft: 14489 corp: 29/3601b lim: 320 exec/s: 42 rss: 70Mb L: 111/276 MS: 1 InsertByte-
00:07:26.033 [2024-11-08 04:51:01.088948] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0xc9c9c9c9c9c9c9c9
00:07:26.033 [2024-11-08 04:51:01.088975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.033 [2024-11-08 04:51:01.089040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c9) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:26.033 [2024-11-08 04:51:01.089054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:26.033 #43 NEW cov: 11836 ft: 14496 corp: 30/3762b lim: 320 exec/s: 43 rss: 70Mb L: 161/276 MS: 1 InsertByte-
00:07:26.033 [2024-11-08 04:51:01.129000] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77777777
00:07:26.033 [2024-11-08 04:51:01.129025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.033 [2024-11-08 04:51:01.129082] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (77) qid:0 cid:5 nsid:8f5c9682 cdw10:77777777 cdw11:77777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7777777777777777
00:07:26.033 [2024-11-08 04:51:01.129097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:26.291 #44 NEW cov: 11836 ft: 14515 corp: 31/3917b lim: 320 exec/s: 44 rss: 70Mb L: 155/276 MS: 1 PersAutoDict- DE: "\377\202\226\\\217\344\322\032"-
00:07:26.291 [2024-11-08 04:51:01.169115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:77777777 cdw11:77777777
00:07:26.291 [2024-11-08 04:51:01.169141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.291 [2024-11-08 04:51:01.169201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (77) qid:0 cid:5 nsid:8f5c9682 cdw10:77777777 cdw11:77777777 SGL TRANSPORT DATA BLOCK TRANSPORT 0x7777777777777777
00:07:26.291 [2024-11-08 04:51:01.169216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:26.291 #45 NEW cov: 11836 ft: 14524 corp: 32/4073b lim: 320 exec/s: 45 rss: 70Mb L: 156/276 MS: 1 InsertByte-
00:07:26.291 [2024-11-08 04:51:01.209180] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x900000000000000
00:07:26.291 [2024-11-08 04:51:01.209205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.291 #46 NEW cov: 11836 ft: 14571 corp: 33/4184b lim: 320 exec/s: 46 rss: 70Mb L: 111/276 MS: 1 PersAutoDict- DE: "\011\016*JW\226\203\000"-
00:07:26.291 [2024-11-08 04:51:01.249390] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:dfdfdfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:26.291 [2024-11-08 04:51:01.249415] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.291 [2024-11-08 04:51:01.249477] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:5 nsid:2a2a2a2a cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xdfdfdfdfdfdfdf
00:07:26.291 [2024-11-08 04:51:01.249490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:26.291 #47 NEW cov: 11836 ft: 14588 corp: 34/4328b lim: 320 exec/s: 47 rss: 70Mb L: 144/276 MS: 1 InsertRepeatedBytes-
00:07:26.291 [2024-11-08 04:51:01.289398] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xd2e48f5c9682ff60
00:07:26.291 [2024-11-08 04:51:01.249423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:26.291 #48 NEW cov: 11836 ft: 14698 corp:
35/4451b lim: 320 exec/s: 48 rss: 70Mb L: 123/276 MS: 1 InsertByte- 00:07:26.291 [2024-11-08 04:51:01.329746] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.291 [2024-11-08 04:51:01.329771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.291 [2024-11-08 04:51:01.329828] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:26.291 [2024-11-08 04:51:01.329842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.291 [2024-11-08 04:51:01.329894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:26.291 [2024-11-08 04:51:01.329908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:26.291 #49 NEW cov: 11836 ft: 14723 corp: 36/4686b lim: 320 exec/s: 49 rss: 70Mb L: 235/276 MS: 1 InsertRepeatedBytes- 00:07:26.291 [2024-11-08 04:51:01.369598] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.291 [2024-11-08 04:51:01.369623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.291 #50 NEW cov: 11836 ft: 14731 corp: 37/4795b lim: 320 exec/s: 50 rss: 70Mb L: 109/276 MS: 1 PersAutoDict- DE: "\377\202\226\\\217\344\322\032"- 00:07:26.550 [2024-11-08 04:51:01.409909] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:dfdfdfdf SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.550 [2024-11-08 04:51:01.409936] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.550 [2024-11-08 04:51:01.409998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (2a) qid:0 cid:5 nsid:2a2a2a2a cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.550 [2024-11-08 04:51:01.410013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.550 #51 NEW cov: 11836 ft: 14760 corp: 38/4939b lim: 320 exec/s: 51 rss: 70Mb L: 144/276 MS: 1 CrossOver- 00:07:26.550 [2024-11-08 04:51:01.450028] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:c9c9c9c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0xc9c9c9c9c9c9c9c9 00:07:26.550 [2024-11-08 04:51:01.450054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.550 [2024-11-08 04:51:01.450116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (c9) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:26.550 [2024-11-08 04:51:01.450131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.550 #52 NEW cov: 11836 ft: 14804 corp: 39/5100b lim: 320 exec/s: 52 rss: 70Mb L: 161/276 MS: 1 ShuffleBytes- 00:07:26.550 [2024-11-08 04:51:01.489977] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.550 [2024-11-08 04:51:01.490004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.550 #53 NEW cov: 11836 ft: 14834 corp: 40/5211b lim: 320 exec/s: 53 rss: 70Mb L: 111/276 MS: 1 ChangeByte- 00:07:26.550 [2024-11-08 04:51:01.530177] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0xa000000 00:07:26.550 [2024-11-08 04:51:01.530202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.550 [2024-11-08 04:51:01.530254] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:26.550 [2024-11-08 04:51:01.530268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.550 #57 NEW cov: 11836 ft: 14868 corp: 41/5376b lim: 320 exec/s: 57 rss: 70Mb L: 165/276 MS: 4 EraseBytes-CrossOver-CrossOver-CrossOver- 00:07:26.550 [2024-11-08 04:51:01.570293] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.550 [2024-11-08 04:51:01.570322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.550 [2024-11-08 04:51:01.570381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ADMIN COMMAND (df) qid:0 cid:5 nsid:0 cdw10:002a0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.550 [2024-11-08 04:51:01.570395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:26.550 #58 NEW cov: 11836 ft: 14896 corp: 42/5539b lim: 320 exec/s: 58 rss: 70Mb L: 163/276 MS: 1 ChangeBinInt- 00:07:26.550 [2024-11-08 04:51:01.610340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.550 [2024-11-08 04:51:01.610365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.550 #59 NEW cov: 11836 ft: 14920 corp: 43/5652b lim: 320 exec/s: 59 rss: 70Mb L: 113/276 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:26.550 [2024-11-08 04:51:01.650427] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:26.550 [2024-11-08 04:51:01.650453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:26.809 #60 NEW cov: 11836 ft: 14924 corp: 44/5763b lim: 320 exec/s: 30 rss: 70Mb L: 111/276 MS: 1 InsertByte- 00:07:26.809 #60 DONE cov: 11836 ft: 14924 corp: 44/5763b lim: 320 exec/s: 30 rss: 70Mb 00:07:26.809 ###### Recommended dictionary. ###### 00:07:26.809 "\377\202\226\\\217\344\322\032" # Uses: 3 00:07:26.809 "\001\014" # Uses: 0 00:07:26.809 "\001\000\000\000" # Uses: 1 00:07:26.809 "\011\016*JW\226\203\000" # Uses: 1 00:07:26.809 ###### End of recommended dictionary. 
###### 00:07:26.809 Done 60 runs in 2 second(s) 00:07:26.809 04:51:01 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_0.conf 00:07:26.809 04:51:01 -- ../common.sh@72 -- # (( i++ )) 00:07:26.809 04:51:01 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:26.809 04:51:01 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:07:26.809 04:51:01 -- nvmf/run.sh@23 -- # local fuzzer_type=1 00:07:26.809 04:51:01 -- nvmf/run.sh@24 -- # local timen=1 00:07:26.809 04:51:01 -- nvmf/run.sh@25 -- # local core=0x1 00:07:26.809 04:51:01 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:26.809 04:51:01 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_1.conf 00:07:26.809 04:51:01 -- nvmf/run.sh@29 -- # printf %02d 1 00:07:26.809 04:51:01 -- nvmf/run.sh@29 -- # port=4401 00:07:26.809 04:51:01 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:26.809 04:51:01 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' 00:07:26.809 04:51:01 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4401"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:26.809 04:51:01 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4401' -c /tmp/fuzz_json_1.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 -Z 1 -r /var/tmp/spdk1.sock 00:07:26.809 [2024-11-08 04:51:01.844116] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:26.810 [2024-11-08 04:51:01.844205] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3678684 ] 00:07:26.810 EAL: No free 2048 kB hugepages reported on node 1 00:07:27.068 [2024-11-08 04:51:02.094295] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.326 [2024-11-08 04:51:02.182741] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:27.326 [2024-11-08 04:51:02.182866] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:27.326 [2024-11-08 04:51:02.241099] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:27.326 [2024-11-08 04:51:02.257432] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4401 *** 00:07:27.326 INFO: Running with entropic power schedule (0xFF, 100). 00:07:27.326 INFO: Seed: 2847757882 00:07:27.326 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:27.326 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:27.326 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_1 00:07:27.326 INFO: A corpus is not provided, starting from an empty corpus 00:07:27.326 #2 INITED exec/s: 0 rss: 60Mb 00:07:27.326 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:27.326 This may also happen if the target rejected all inputs we tried so far 00:07:27.326 [2024-11-08 04:51:02.333684] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:27.326 [2024-11-08 04:51:02.333931] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:27.327 [2024-11-08 04:51:02.334360] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.327 [2024-11-08 04:51:02.334400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.327 [2024-11-08 04:51:02.334472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.327 [2024-11-08 04:51:02.334488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.585 NEW_FUNC[1/671]: 0x43b158 in fuzz_admin_get_log_page_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:67 00:07:27.585 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:27.585 #3 NEW cov: 11622 ft: 11623 corp: 2/13b lim: 30 exec/s: 0 rss: 68Mb L: 12/12 MS: 1 InsertRepeatedBytes- 00:07:27.585 [2024-11-08 04:51:02.664492] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.585 [2024-11-08 04:51:02.664702] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.585 [2024-11-08 04:51:02.664870] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.585 [2024-11-08 04:51:02.665258] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.585 [2024-11-08 04:51:02.665308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.585 [2024-11-08 04:51:02.665461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.585 [2024-11-08 04:51:02.665485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.585 [2024-11-08 04:51:02.665642] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.585 [2024-11-08 04:51:02.665668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.585 #16 NEW cov: 11735 ft: 12644 corp: 3/32b lim: 30 exec/s: 0 rss: 69Mb L: 19/19 MS: 3 CopyPart-InsertByte-InsertRepeatedBytes- 00:07:27.843 [2024-11-08 04:51:02.714697] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.843 [2024-11-08 04:51:02.714882] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.843 [2024-11-08 04:51:02.715049] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cec6 00:07:27.843 [2024-11-08 
04:51:02.715443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.843 [2024-11-08 04:51:02.715479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.843 [2024-11-08 04:51:02.715606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.843 [2024-11-08 04:51:02.715624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.843 [2024-11-08 04:51:02.715755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.843 [2024-11-08 04:51:02.715773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.843 #17 NEW cov: 11741 ft: 12784 corp: 4/51b lim: 30 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 ChangeBit- 00:07:27.843 [2024-11-08 04:51:02.774672] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.843 [2024-11-08 04:51:02.774861] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000ace 00:07:27.843 [2024-11-08 04:51:02.775031] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.843 [2024-11-08 04:51:02.775211] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.843 [2024-11-08 04:51:02.775611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.843 [2024-11-08 04:51:02.775642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.843 [2024-11-08 04:51:02.775780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.843 [2024-11-08 04:51:02.775799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.843 [2024-11-08 04:51:02.775934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.775953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.844 [2024-11-08 04:51:02.776074] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.776093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:27.844 #18 NEW cov: 11826 ft: 13523 corp: 5/80b lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CopyPart- 00:07:27.844 [2024-11-08 04:51:02.824838] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.844 [2024-11-08 04:51:02.825017] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:27.844 
[2024-11-08 04:51:02.825187] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cec6 00:07:27.844 [2024-11-08 04:51:02.825551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.825581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.844 [2024-11-08 04:51:02.825722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.825740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.844 [2024-11-08 04:51:02.825879] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.825904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.844 #19 NEW cov: 11826 ft: 13617 corp: 6/99b lim: 30 exec/s: 0 rss: 69Mb L: 19/29 MS: 1 ChangeASCIIInt- 00:07:27.844 [2024-11-08 04:51:02.885070] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:27.844 [2024-11-08 04:51:02.885413] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:27.844 [2024-11-08 04:51:02.885785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.885814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.844 [2024-11-08 04:51:02.885952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.885969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:27.844 [2024-11-08 04:51:02.886097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.886121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:27.844 #20 NEW cov: 11858 ft: 13723 corp: 7/117b lim: 30 exec/s: 0 rss: 69Mb L: 18/29 MS: 1 InsertRepeatedBytes- 00:07:27.844 [2024-11-08 04:51:02.945099] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff2e 00:07:27.844 [2024-11-08 04:51:02.945278] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:27.844 [2024-11-08 04:51:02.945652] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.945690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:27.844 [2024-11-08 04:51:02.945825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG 
PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:27.844 [2024-11-08 04:51:02.945843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.102 #21 NEW cov: 11858 ft: 13775 corp: 8/129b lim: 30 exec/s: 0 rss: 69Mb L: 12/29 MS: 1 ChangeByte- 00:07:28.102 [2024-11-08 04:51:02.995356] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.102 [2024-11-08 04:51:02.995691] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:28.102 [2024-11-08 04:51:02.996065] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0b8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.102 [2024-11-08 04:51:02.996094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.102 [2024-11-08 04:51:02.996231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.102 [2024-11-08 04:51:02.996248] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.102 [2024-11-08 04:51:02.996377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.102 [2024-11-08 04:51:02.996397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.102 #22 NEW cov: 11858 ft: 13908 corp: 9/147b lim: 30 exec/s: 0 rss: 69Mb L: 18/29 MS: 1 ChangeBinInt- 00:07:28.102 [2024-11-08 04:51:03.055571] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.102 [2024-11-08 04:51:03.055761] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.102 [2024-11-08 04:51:03.055929] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000cece 00:07:28.103 [2024-11-08 04:51:03.056338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.056368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.103 [2024-11-08 04:51:03.056494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.056512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.103 [2024-11-08 04:51:03.056637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.056654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.103 #23 NEW cov: 11858 ft: 13928 corp: 10/170b lim: 30 exec/s: 0 rss: 69Mb L: 23/29 MS: 1 CMP- DE: "\377\377\3777"- 00:07:28.103 [2024-11-08 04:51:03.115805] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: 
Invalid log page offset 0x20000ffff 00:07:28.103 [2024-11-08 04:51:03.116126] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:28.103 [2024-11-08 04:51:03.116507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff02ff cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.116539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.103 [2024-11-08 04:51:03.116688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.116706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.103 [2024-11-08 04:51:03.116835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.116855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.103 #24 NEW cov: 11858 ft: 13958 corp: 11/188b lim: 30 exec/s: 0 rss: 69Mb L: 18/29 MS: 1 ChangeBit- 00:07:28.103 [2024-11-08 04:51:03.166053] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.103 [2024-11-08 04:51:03.166236] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.103 [2024-11-08 04:51:03.166407] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.103 [2024-11-08 04:51:03.166572] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.103 [2024-11-08 04:51:03.166987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.167016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.103 [2024-11-08 04:51:03.167148] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.167168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.103 [2024-11-08 04:51:03.167304] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.167327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.103 [2024-11-08 04:51:03.167458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.103 [2024-11-08 04:51:03.167475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.103 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:28.103 #25 NEW cov: 11881 ft: 13989 corp: 12/217b 
lim: 30 exec/s: 0 rss: 69Mb L: 29/29 MS: 1 CopyPart- 00:07:28.362 [2024-11-08 04:51:03.226081] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.362 [2024-11-08 04:51:03.226269] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261124) > buf size (4096) 00:07:28.362 [2024-11-08 04:51:03.226437] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.362 [2024-11-08 04:51:03.226856] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:23ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.226884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.227021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.227037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.227167] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.227186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.362 #26 NEW cov: 11889 ft: 14036 corp: 13/236b lim: 30 exec/s: 0 rss: 69Mb L: 19/29 MS: 1 InsertByte- 00:07:28.362 [2024-11-08 04:51:03.276358] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.362 [2024-11-08 04:51:03.276543] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.362 [2024-11-08 04:51:03.276705] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.362 [2024-11-08 04:51:03.277080] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.277109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.277255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.277272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.277404] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.277423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.362 #27 NEW cov: 11889 ft: 14077 corp: 14/258b lim: 30 exec/s: 0 rss: 69Mb L: 22/29 MS: 1 CopyPart- 00:07:28.362 [2024-11-08 04:51:03.326276] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.362 [2024-11-08 04:51:03.326659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:00008300 cdw11:00000003 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.326690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.362 #28 NEW cov: 11889 ft: 14484 corp: 15/267b lim: 30 exec/s: 28 rss: 70Mb L: 9/29 MS: 1 EraseBytes- 00:07:28.362 [2024-11-08 04:51:03.386714] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:28.362 [2024-11-08 04:51:03.386906] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.362 [2024-11-08 04:51:03.387074] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff2e 00:07:28.362 [2024-11-08 04:51:03.387238] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:28.362 [2024-11-08 04:51:03.387609] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.387637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.387753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000083ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.387771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.387910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.387929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.388073] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.388090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.362 #29 NEW cov: 11889 ft: 14505 corp: 16/291b lim: 30 exec/s: 29 rss: 70Mb L: 24/29 MS: 1 CrossOver- 00:07:28.362 [2024-11-08 04:51:03.446860] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.362 [2024-11-08 04:51:03.447188] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xffff 00:07:28.362 [2024-11-08 04:51:03.447588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.447617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.447758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.447779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.362 [2024-11-08 04:51:03.447915] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 
cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.362 [2024-11-08 04:51:03.447933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.362 #30 NEW cov: 11889 ft: 14512 corp: 17/309b lim: 30 exec/s: 30 rss: 70Mb L: 18/29 MS: 1 CopyPart- 00:07:28.621 [2024-11-08 04:51:03.496994] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.621 [2024-11-08 04:51:03.497187] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (736060) > buf size (4096) 00:07:28.621 [2024-11-08 04:51:03.497561] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.497591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.621 [2024-11-08 04:51:03.497727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.497764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.621 #31 NEW cov: 11889 ft: 14532 corp: 18/321b lim: 30 exec/s: 31 rss: 70Mb L: 12/29 MS: 1 EraseBytes- 00:07:28.621 [2024-11-08 04:51:03.547177] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.621 [2024-11-08 04:51:03.547354] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (261124) > buf size (4096) 00:07:28.621 [2024-11-08 04:51:03.547531] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.621 [2024-11-08 04:51:03.547937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:23ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.547966] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.621 [2024-11-08 04:51:03.548104] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ff000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.548122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.621 [2024-11-08 04:51:03.548259] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ceff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.548277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.621 #32 NEW cov: 11889 ft: 14538 corp: 19/340b lim: 30 exec/s: 32 rss: 70Mb L: 19/29 MS: 1 CrossOver- 00:07:28.621 [2024-11-08 04:51:03.607587] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:28.621 [2024-11-08 04:51:03.607766] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.621 [2024-11-08 04:51:03.607935] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xff2e 00:07:28.621 [2024-11-08 04:51:03.608097] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 
0x30000ff0a 00:07:28.621 [2024-11-08 04:51:03.608497] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.608530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.621 [2024-11-08 04:51:03.608667] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000083ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.608687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.621 [2024-11-08 04:51:03.608822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.608840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.621 [2024-11-08 04:51:03.608978] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.608997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.621 #33 NEW cov: 11889 ft: 14556 corp: 20/364b lim: 30 exec/s: 33 rss: 70Mb L: 24/29 MS: 1 ChangeByte- 00:07:28.621 [2024-11-08 04:51:03.667682] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.621 [2024-11-08 04:51:03.667880] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.621 [2024-11-08 04:51:03.668053] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000ce32 00:07:28.621 [2024-11-08 04:51:03.668467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.668495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.621 [2024-11-08 04:51:03.668636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.668653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.621 [2024-11-08 04:51:03.668792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.621 [2024-11-08 04:51:03.668811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.621 #34 NEW cov: 11889 ft: 14570 corp: 21/382b lim: 30 exec/s: 34 rss: 70Mb L: 18/29 MS: 1 EraseBytes- 00:07:28.621 [2024-11-08 04:51:03.727935] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:28.621 [2024-11-08 04:51:03.728276] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:28.621 [2024-11-08 04:51:03.728462] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid 
log page offset 0x200007272 00:07:28.621 [2024-11-08 04:51:03.728859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ff0b8300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.622 [2024-11-08 04:51:03.728890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.622 [2024-11-08 04:51:03.729032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.622 [2024-11-08 04:51:03.729052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.622 [2024-11-08 04:51:03.729186] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.622 [2024-11-08 04:51:03.729206] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.622 [2024-11-08 04:51:03.729353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:72720272 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.622 [2024-11-08 04:51:03.729381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:28.880 #35 NEW cov: 11889 ft: 14634 corp: 22/410b lim: 30 exec/s: 35 rss: 70Mb L: 28/29 MS: 1 InsertRepeatedBytes- 00:07:28.880 [2024-11-08 04:51:03.788093] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.880 [2024-11-08 04:51:03.788286] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.880 [2024-11-08 04:51:03.788467] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.880 [2024-11-08 04:51:03.788880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace028e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.880 [2024-11-08 04:51:03.788911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.880 [2024-11-08 04:51:03.789045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.880 [2024-11-08 04:51:03.789063] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.880 [2024-11-08 04:51:03.789195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.880 [2024-11-08 04:51:03.789219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.880 #36 NEW cov: 11889 ft: 14639 corp: 23/429b lim: 30 exec/s: 36 rss: 70Mb L: 19/29 MS: 1 ChangeBit- 00:07:28.880 [2024-11-08 04:51:03.838235] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.880 [2024-11-08 04:51:03.838417] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.880 [2024-11-08 04:51:03.838598] 
ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.880 [2024-11-08 04:51:03.838965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.881 [2024-11-08 04:51:03.838994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.881 [2024-11-08 04:51:03.839120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.881 [2024-11-08 04:51:03.839141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.881 [2024-11-08 04:51:03.839278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.881 [2024-11-08 04:51:03.839295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.881 #37 NEW cov: 11889 ft: 14652 corp: 24/451b lim: 30 exec/s: 37 rss: 70Mb L: 22/29 MS: 1 ShuffleBytes- 00:07:28.881 [2024-11-08 04:51:03.898441] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (535356) > buf size (4096) 00:07:28.881 [2024-11-08 04:51:03.898635] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.881 [2024-11-08 04:51:03.898806] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.881 [2024-11-08 04:51:03.899170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.881 [2024-11-08 04:51:03.899199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.881 [2024-11-08 04:51:03.899336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000200 cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.881 [2024-11-08 04:51:03.899353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:28.881 [2024-11-08 04:51:03.899484] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.881 [2024-11-08 04:51:03.899502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:28.881 #38 NEW cov: 11889 ft: 14668 corp: 25/473b lim: 30 exec/s: 38 rss: 70Mb L: 22/29 MS: 1 CMP- DE: " \000\000\000"- 00:07:28.881 [2024-11-08 04:51:03.958452] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:28.881 [2024-11-08 04:51:03.958852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:28.881 [2024-11-08 04:51:03.958881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:28.881 #39 NEW cov: 11889 ft: 14694 corp: 26/481b lim: 30 exec/s: 39 rss: 70Mb L: 8/29 MS: 1 CrossOver- 00:07:29.140 
[2024-11-08 04:51:04.008872] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:29.140 [2024-11-08 04:51:04.009055] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (1048576) > buf size (4096) 00:07:29.140 [2024-11-08 04:51:04.009241] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:29.140 [2024-11-08 04:51:04.009407] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:29.140 [2024-11-08 04:51:04.009820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:23ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.009850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.009982] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.010000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.010140] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff00ff cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.010160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.010294] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:00008300 cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.010312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.140 #40 NEW cov: 11889 ft: 14734 corp: 27/508b lim: 30 exec/s: 40 rss: 70Mb L: 27/29 MS: 1 CrossOver- 00:07:29.140 [2024-11-08 04:51:04.059071] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:29.140 [2024-11-08 04:51:04.059253] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x200000ace 00:07:29.140 [2024-11-08 04:51:04.059429] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:29.140 [2024-11-08 04:51:04.059608] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:29.140 [2024-11-08 04:51:04.059991] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.060019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.060153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.060173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.060308] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:29.140 [2024-11-08 04:51:04.060325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.060456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.060477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.140 #41 NEW cov: 11889 ft: 14783 corp: 28/537b lim: 30 exec/s: 41 rss: 70Mb L: 29/29 MS: 1 ShuffleBytes- 00:07:29.140 [2024-11-08 04:51:04.109105] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:29.140 [2024-11-08 04:51:04.109297] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0xfe 00:07:29.140 [2024-11-08 04:51:04.109478] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:29.140 [2024-11-08 04:51:04.109882] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.109915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.110050] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.110069] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.110202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:00ff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.110220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.140 #42 NEW cov: 11889 ft: 14885 corp: 29/556b lim: 30 exec/s: 42 rss: 70Mb L: 19/29 MS: 1 InsertByte- 00:07:29.140 [2024-11-08 04:51:04.159392] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:29.140 [2024-11-08 04:51:04.159600] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:29.140 [2024-11-08 04:51:04.159795] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:29.140 [2024-11-08 04:51:04.159970] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:29.140 [2024-11-08 04:51:04.160345] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.160374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.160500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000083ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.160518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.160657] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.160673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.140 [2024-11-08 04:51:04.160814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.160830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.140 #43 NEW cov: 11889 ft: 14895 corp: 30/580b lim: 30 exec/s: 43 rss: 70Mb L: 24/29 MS: 1 CrossOver- 00:07:29.140 [2024-11-08 04:51:04.209473] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:29.140 [2024-11-08 04:51:04.209668] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:29.140 [2024-11-08 04:51:04.209851] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:29.140 [2024-11-08 04:51:04.210249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0ace028e cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.140 [2024-11-08 04:51:04.210281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.141 [2024-11-08 04:51:04.210418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.141 [2024-11-08 04:51:04.210437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.141 [2024-11-08 04:51:04.210576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.141 [2024-11-08 04:51:04.210598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.141 #44 NEW cov: 11889 ft: 14997 corp: 31/603b lim: 30 exec/s: 44 rss: 70Mb L: 23/29 MS: 1 PersAutoDict- DE: " \000\000\000"- 00:07:29.399 [2024-11-08 04:51:04.269626] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cece 00:07:29.399 [2024-11-08 04:51:04.269814] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x20000cec6 00:07:29.399 [2024-11-08 04:51:04.270216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:0a9b02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.399 [2024-11-08 04:51:04.270245] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.399 [2024-11-08 04:51:04.270372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:cece02ce cdw11:00000002 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.399 [2024-11-08 04:51:04.270389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.399 #45 NEW cov: 11889 ft: 15009 corp: 32/616b lim: 30 exec/s: 45 rss: 70Mb L: 13/29 MS: 1 InsertByte- 00:07:29.399 [2024-11-08 
04:51:04.329885] ctrlr.c:2516:nvmf_ctrlr_get_log_page: *ERROR*: Get log page: len (262144) > buf size (4096) 00:07:29.399 [2024-11-08 04:51:04.330072] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ffff 00:07:29.399 [2024-11-08 04:51:04.330255] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff2e 00:07:29.399 [2024-11-08 04:51:04.330437] ctrlr.c:2504:nvmf_ctrlr_get_log_page: *ERROR*: Invalid log page offset 0x30000ff0a 00:07:29.399 [2024-11-08 04:51:04.330857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:4 nsid:0 cdw10:ffff0000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.399 [2024-11-08 04:51:04.330887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.399 [2024-11-08 04:51:04.331026] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:5 nsid:0 cdw10:000083ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.399 [2024-11-08 04:51:04.331045] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:29.399 [2024-11-08 04:51:04.331183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:6 nsid:0 cdw10:ffce83ce cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.399 [2024-11-08 04:51:04.331202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:29.399 [2024-11-08 04:51:04.331346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: GET LOG PAGE (02) qid:0 cid:7 nsid:0 cdw10:ffff83ff cdw11:00000003 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.399 [2024-11-08 04:51:04.331365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:29.399 #46 NEW cov: 11889 ft: 15016 corp: 33/640b lim: 30 exec/s: 23 rss: 70Mb L: 24/29 MS: 1 CrossOver- 00:07:29.399 #46 DONE cov: 11889 ft: 15016 corp: 33/640b lim: 30 exec/s: 23 rss: 70Mb 00:07:29.399 ###### Recommended dictionary. ###### 00:07:29.399 "\377\377\3777" # Uses: 0 00:07:29.399 " \000\000\000" # Uses: 1 00:07:29.399 ###### End of recommended dictionary. 
###### 00:07:29.399 Done 46 runs in 2 second(s) 00:07:29.399 04:51:04 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_1.conf 00:07:29.399 04:51:04 -- ../common.sh@72 -- # (( i++ )) 00:07:29.399 04:51:04 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:29.399 04:51:04 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:07:29.399 04:51:04 -- nvmf/run.sh@23 -- # local fuzzer_type=2 00:07:29.399 04:51:04 -- nvmf/run.sh@24 -- # local timen=1 00:07:29.399 04:51:04 -- nvmf/run.sh@25 -- # local core=0x1 00:07:29.399 04:51:04 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:29.399 04:51:04 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_2.conf 00:07:29.399 04:51:04 -- nvmf/run.sh@29 -- # printf %02d 2 00:07:29.399 04:51:04 -- nvmf/run.sh@29 -- # port=4402 00:07:29.399 04:51:04 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:29.399 04:51:04 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' 00:07:29.400 04:51:04 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4402"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:29.400 04:51:04 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4402' -c /tmp/fuzz_json_2.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 -Z 2 -r /var/tmp/spdk2.sock 00:07:29.658 [2024-11-08 04:51:04.519665] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:29.658 [2024-11-08 04:51:04.519728] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3679421 ] 00:07:29.658 EAL: No free 2048 kB hugepages reported on node 1 00:07:29.916 [2024-11-08 04:51:04.774556] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.916 [2024-11-08 04:51:04.862970] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:29.916 [2024-11-08 04:51:04.863104] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:29.916 [2024-11-08 04:51:04.921840] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:29.916 [2024-11-08 04:51:04.938140] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4402 *** 00:07:29.916 INFO: Running with entropic power schedule (0xFF, 100). 00:07:29.916 INFO: Seed: 1232175366 00:07:29.916 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:29.916 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:29.916 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_2 00:07:29.916 INFO: A corpus is not provided, starting from an empty corpus 00:07:29.916 #2 INITED exec/s: 0 rss: 60Mb 00:07:29.916 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:29.916 This may also happen if the target rejected all inputs we tried so far 00:07:29.916 [2024-11-08 04:51:04.987012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.916 [2024-11-08 04:51:04.987042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:29.916 [2024-11-08 04:51:04.987095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:29.916 [2024-11-08 04:51:04.987110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.174 NEW_FUNC[1/670]: 0x43db78 in fuzz_admin_identify_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:95 00:07:30.174 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:30.433 #4 NEW cov: 11580 ft: 11581 corp: 2/19b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 2 InsertByte-InsertRepeatedBytes- 00:07:30.433 [2024-11-08 04:51:05.297802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.297837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.433 [2024-11-08 04:51:05.297893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3737003f cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.297914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.433 #5 NEW cov: 11693 ft: 12121 corp: 3/37b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 ChangeByte- 00:07:30.433 [2024-11-08 04:51:05.347862] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.347890] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.433 [2024-11-08 04:51:05.347946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.347961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.433 #6 NEW cov: 11699 ft: 12346 corp: 4/55b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 ChangeBit- 00:07:30.433 [2024-11-08 04:51:05.387942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.387969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.433 [2024-11-08 04:51:05.388022] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:33003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 
04:51:05.388035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.433 #7 NEW cov: 11784 ft: 12536 corp: 5/73b lim: 35 exec/s: 0 rss: 68Mb L: 18/18 MS: 1 ChangeASCIIInt- 00:07:30.433 [2024-11-08 04:51:05.428054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.428081] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.433 [2024-11-08 04:51:05.428138] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:33003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.428155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.433 #8 NEW cov: 11784 ft: 12629 corp: 6/92b lim: 35 exec/s: 0 rss: 68Mb L: 19/19 MS: 1 InsertByte- 00:07:30.433 [2024-11-08 04:51:05.468487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.468514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.433 [2024-11-08 04:51:05.468572] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.468587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.433 [2024-11-08 04:51:05.468640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:38003338 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.468655] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.433 [2024-11-08 04:51:05.468710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:37330037 cdw11:70003838 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.468723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.433 #9 NEW cov: 11784 ft: 13237 corp: 7/122b lim: 35 exec/s: 0 rss: 68Mb L: 30/30 MS: 1 CopyPart- 00:07:30.433 [2024-11-08 04:51:05.508323] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.508349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.433 [2024-11-08 04:51:05.508403] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.433 [2024-11-08 04:51:05.508418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.433 #10 NEW cov: 11784 ft: 13308 corp: 8/140b lim: 35 exec/s: 0 rss: 68Mb L: 18/30 MS: 1 ChangeASCIIInt- 00:07:30.692 [2024-11-08 
04:51:05.548427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.548453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.548507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c8c800c9 cdw11:3700c937 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.548521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.692 #11 NEW cov: 11784 ft: 13447 corp: 9/158b lim: 35 exec/s: 0 rss: 69Mb L: 18/30 MS: 1 ChangeBinInt- 00:07:30.692 [2024-11-08 04:51:05.588598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.588624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.588679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37002f37 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.588694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.692 #12 NEW cov: 11784 ft: 13502 corp: 10/177b lim: 35 exec/s: 0 rss: 69Mb L: 19/30 MS: 1 InsertByte- 00:07:30.692 [2024-11-08 04:51:05.628385] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:30.692 [2024-11-08 04:51:05.628509] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:30.692 [2024-11-08 04:51:05.628721] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.628748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.628802] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.628819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.692 #14 NEW cov: 11793 ft: 13593 corp: 11/196b lim: 35 exec/s: 0 rss: 69Mb L: 19/30 MS: 2 CopyPart-InsertRepeatedBytes- 00:07:30.692 [2024-11-08 04:51:05.668756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.668782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.668838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:377e0037 cdw11:33003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.668852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 
00:07:30.692 #15 NEW cov: 11793 ft: 13632 corp: 12/214b lim: 35 exec/s: 0 rss: 69Mb L: 18/30 MS: 1 ChangeByte- 00:07:30.692 [2024-11-08 04:51:05.709008] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.709034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.709089] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37010037 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.709103] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.709157] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:17003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.709171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.692 #16 NEW cov: 11793 ft: 13823 corp: 13/236b lim: 35 exec/s: 0 rss: 69Mb L: 22/30 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:30.692 [2024-11-08 04:51:05.749114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:12003712 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.749139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.749195] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:12120012 cdw11:37001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.749210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.749261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c9c80037 cdw11:3700c8c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.749276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.692 #17 NEW cov: 11793 ft: 13846 corp: 14/262b lim: 35 exec/s: 0 rss: 69Mb L: 26/30 MS: 1 InsertRepeatedBytes- 00:07:30.692 [2024-11-08 04:51:05.789276] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.789301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.789355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.789369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.692 [2024-11-08 04:51:05.789421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.692 [2024-11-08 04:51:05.789435] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.951 #18 NEW cov: 11793 ft: 13877 corp: 15/287b lim: 35 exec/s: 0 rss: 69Mb L: 25/30 MS: 1 CrossOver- 00:07:30.951 [2024-11-08 04:51:05.829504] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0ac80096 cdw11:3700c937 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.829534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:05.829589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:12120012 cdw11:12001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.829605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:05.829657] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370012 cdw11:c80037c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.829671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:05.829724] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:373700c9 cdw11:17003736 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.829737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:30.951 #19 NEW cov: 11793 ft: 13903 corp: 16/316b lim: 35 exec/s: 0 rss: 69Mb L: 29/30 MS: 1 CrossOver- 00:07:30.951 [2024-11-08 04:51:05.869492] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.869519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:05.869636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:17003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.869651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.951 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:30.951 #20 NEW cov: 11816 ft: 14290 corp: 17/338b lim: 35 exec/s: 0 rss: 69Mb L: 22/30 MS: 1 ShuffleBytes- 00:07:30.951 [2024-11-08 04:51:05.919648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.919674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:05.919727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.919742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:05.919797] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.919810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.951 #21 NEW cov: 11816 ft: 14348 corp: 18/364b lim: 35 exec/s: 0 rss: 69Mb L: 26/30 MS: 1 InsertByte- 00:07:30.951 [2024-11-08 04:51:05.959505] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:30.951 [2024-11-08 04:51:05.959855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.959880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:05.959935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370000 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.959950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.951 #22 NEW cov: 11816 ft: 14449 corp: 19/391b lim: 35 exec/s: 22 rss: 69Mb L: 27/30 MS: 1 InsertRepeatedBytes- 00:07:30.951 [2024-11-08 04:51:05.999885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:12003712 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.999915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:05.999971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:12120012 cdw11:37001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:05.999986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:06.000039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c9c80037 cdw11:3700c8c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:06.000054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:30.951 #23 NEW cov: 11816 ft: 14488 corp: 20/417b lim: 35 exec/s: 23 rss: 69Mb L: 26/30 MS: 1 ChangeASCIIInt- 00:07:30.951 [2024-11-08 04:51:06.039976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:06.040002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:30.951 [2024-11-08 04:51:06.040112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:25003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:30.951 [2024-11-08 04:51:06.040127] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.209 #24 NEW cov: 11816 ft: 14503 corp: 21/440b lim: 35 exec/s: 24 rss: 69Mb L: 23/30 MS: 1 InsertByte- 00:07:31.209 [2024-11-08 04:51:06.079983] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.209 [2024-11-08 04:51:06.080009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.209 [2024-11-08 04:51:06.080063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37002f37 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.209 [2024-11-08 04:51:06.080076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.209 #30 NEW cov: 11816 ft: 14520 corp: 22/459b lim: 35 exec/s: 30 rss: 69Mb L: 19/30 MS: 1 CopyPart- 00:07:31.209 [2024-11-08 04:51:06.120083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.209 [2024-11-08 04:51:06.120109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.209 [2024-11-08 04:51:06.120163] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:00003701 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.209 [2024-11-08 04:51:06.120177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.209 #31 NEW cov: 11816 ft: 14602 corp: 23/477b lim: 35 exec/s: 31 rss: 69Mb L: 18/30 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:31.209 [2024-11-08 04:51:06.160328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.209 [2024-11-08 04:51:06.160353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.209 [2024-11-08 04:51:06.160466] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:17003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.209 [2024-11-08 04:51:06.160482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.209 #32 NEW cov: 11816 ft: 14638 corp: 24/499b lim: 35 exec/s: 32 rss: 69Mb L: 22/30 MS: 1 CopyPart- 00:07:31.210 [2024-11-08 04:51:06.200407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.210 [2024-11-08 04:51:06.200432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.210 [2024-11-08 04:51:06.200488] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:38010037 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.210 [2024-11-08 04:51:06.200502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.210 [2024-11-08 04:51:06.200558] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:17003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.210 [2024-11-08 04:51:06.200571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.210 #33 NEW cov: 11816 ft: 14648 corp: 25/521b lim: 35 exec/s: 33 rss: 69Mb L: 22/30 MS: 1 ChangeASCIIInt- 00:07:31.210 [2024-11-08 04:51:06.240268] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.210 [2024-11-08 04:51:06.240293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.210 #34 NEW cov: 11816 ft: 14971 corp: 26/533b lim: 35 exec/s: 34 rss: 69Mb L: 12/30 MS: 1 InsertRepeatedBytes- 00:07:31.210 #35 NEW cov: 11816 ft: 15063 corp: 27/545b lim: 35 exec/s: 35 rss: 69Mb L: 12/30 MS: 1 PersAutoDict- DE: "\001\000\000\000"- 00:07:31.468 [2024-11-08 04:51:06.320820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.320846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.320962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:17003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.320977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.468 #36 NEW cov: 11816 ft: 15076 corp: 28/567b lim: 35 exec/s: 36 rss: 69Mb L: 22/30 MS: 1 ChangeBit- 00:07:31.468 [2024-11-08 04:51:06.360617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:ffff000a cdw11:ff00ffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.360643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.468 #37 NEW cov: 11816 ft: 15085 corp: 29/579b lim: 35 exec/s: 37 rss: 69Mb L: 12/30 MS: 1 ShuffleBytes- 00:07:31.468 [2024-11-08 04:51:06.400850] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:370037c8 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.400875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.400929] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.400943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.468 #38 NEW cov: 11816 ft: 15116 corp: 30/598b lim: 35 exec/s: 38 rss: 69Mb L: 19/30 MS: 1 InsertByte- 00:07:31.468 [2024-11-08 04:51:06.441216] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.441242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.441296] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 
[2024-11-08 04:51:06.441314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.441369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:38003337 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.441384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.441439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:37330038 cdw11:70003838 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.441453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.468 #39 NEW cov: 11816 ft: 15136 corp: 31/628b lim: 35 exec/s: 39 rss: 69Mb L: 30/30 MS: 1 ShuffleBytes- 00:07:31.468 [2024-11-08 04:51:06.481341] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.481366] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.481421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3737003f cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.481435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.481489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:3737000a cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.481503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.481576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.481590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.468 #40 NEW cov: 11816 ft: 15151 corp: 32/661b lim: 35 exec/s: 40 rss: 70Mb L: 33/33 MS: 1 CopyPart- 00:07:31.468 [2024-11-08 04:51:06.521231] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.468 [2024-11-08 04:51:06.521256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.468 [2024-11-08 04:51:06.521311] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:00003701 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.469 [2024-11-08 04:51:06.521325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.469 #41 NEW cov: 11816 ft: 15165 corp: 33/679b lim: 35 exec/s: 41 rss: 70Mb L: 18/33 MS: 1 ChangeASCIIInt- 00:07:31.469 [2024-11-08 04:51:06.561470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 
nsid:0 cdw10:0a370096 cdw11:120037a2 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.469 [2024-11-08 04:51:06.561496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.469 [2024-11-08 04:51:06.561565] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:12120012 cdw11:37001212 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.469 [2024-11-08 04:51:06.561580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.469 [2024-11-08 04:51:06.561634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:c9c80037 cdw11:3700c8c9 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.469 [2024-11-08 04:51:06.561651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.727 #42 NEW cov: 11816 ft: 15235 corp: 34/705b lim: 35 exec/s: 42 rss: 70Mb L: 26/33 MS: 1 ChangeByte- 00:07:31.727 [2024-11-08 04:51:06.601606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.601632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.601688] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.601703] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.601758] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.601773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.727 #43 NEW cov: 11816 ft: 15240 corp: 35/730b lim: 35 exec/s: 43 rss: 70Mb L: 25/33 MS: 1 ChangeByte- 00:07:31.727 [2024-11-08 04:51:06.641577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.641603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.641659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:c8c800c9 cdw11:3700c937 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.641674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.727 #44 NEW cov: 11816 ft: 15248 corp: 36/748b lim: 35 exec/s: 44 rss: 70Mb L: 18/33 MS: 1 ChangeASCIIInt- 00:07:31.727 [2024-11-08 04:51:06.681541] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:31.727 [2024-11-08 04:51:06.681956] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:5d000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.681982] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.682037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00370000 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.682054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.682108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:01000037 cdw11:37000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.682122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.682175] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:37370037 cdw11:37003717 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.682189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.727 #45 NEW cov: 11816 ft: 15249 corp: 37/776b lim: 35 exec/s: 45 rss: 70Mb L: 28/33 MS: 1 InsertByte- 00:07:31.727 [2024-11-08 04:51:06.722085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.722111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.722170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:377e0037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.722185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.722242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37330037 cdw11:38003838 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.722256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.727 [2024-11-08 04:51:06.722309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:33380037 cdw11:38003838 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.722322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.727 #46 NEW cov: 11816 ft: 15254 corp: 38/804b lim: 35 exec/s: 46 rss: 70Mb L: 28/33 MS: 1 CrossOver- 00:07:31.727 [2024-11-08 04:51:06.761869] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:31.727 [2024-11-08 04:51:06.762314] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.727 [2024-11-08 04:51:06.762341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.728 [2024-11-08 04:51:06.762398] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 
cdw10:37370000 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.728 [2024-11-08 04:51:06.762414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.728 [2024-11-08 04:51:06.762529] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:37370037 cdw11:96001737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.728 [2024-11-08 04:51:06.762545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.728 #47 NEW cov: 11816 ft: 15272 corp: 39/833b lim: 35 exec/s: 47 rss: 70Mb L: 29/33 MS: 1 CrossOver- 00:07:31.728 [2024-11-08 04:51:06.801913] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:31.728 [2024-11-08 04:51:06.802313] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:5d000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.728 [2024-11-08 04:51:06.802339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.728 [2024-11-08 04:51:06.802396] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:00370000 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.728 [2024-11-08 04:51:06.802412] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.728 [2024-11-08 04:51:06.802467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:01000037 cdw11:37000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.728 [2024-11-08 04:51:06.802481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.728 [2024-11-08 04:51:06.802536] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:37370037 cdw11:37003617 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.728 [2024-11-08 04:51:06.802550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.728 #48 NEW cov: 11816 ft: 15326 corp: 40/861b lim: 35 exec/s: 48 rss: 70Mb L: 28/33 MS: 1 ChangeASCIIInt- 00:07:31.986 [2024-11-08 04:51:06.842307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.986 [2024-11-08 04:51:06.842332] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.986 [2024-11-08 04:51:06.842385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.986 [2024-11-08 04:51:06.842398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.986 [2024-11-08 04:51:06.842453] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:38003338 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.986 [2024-11-08 04:51:06.842467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 
00:07:31.986 #49 NEW cov: 11816 ft: 15333 corp: 41/882b lim: 35 exec/s: 49 rss: 70Mb L: 21/33 MS: 1 EraseBytes- 00:07:31.986 [2024-11-08 04:51:06.882537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.986 [2024-11-08 04:51:06.882562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.986 [2024-11-08 04:51:06.882617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:3737003f cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.986 [2024-11-08 04:51:06.882631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.986 [2024-11-08 04:51:06.882683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:372d000a cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.987 [2024-11-08 04:51:06.882698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.987 [2024-11-08 04:51:06.882748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:37370037 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.987 [2024-11-08 04:51:06.882762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.987 #50 NEW cov: 11816 ft: 15352 corp: 42/915b lim: 35 exec/s: 50 rss: 70Mb L: 33/33 MS: 1 ChangeByte- 00:07:31.987 [2024-11-08 04:51:06.922238] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:31.987 [2024-11-08 04:51:06.922478] ctrlr.c:2598:_nvmf_subsystem_get_ns_safe: *ERROR*: Identify Namespace for invalid NSID 0 00:07:31.987 [2024-11-08 04:51:06.922713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.987 [2024-11-08 04:51:06.922739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:31.987 [2024-11-08 04:51:06.922795] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:5 nsid:0 cdw10:37370000 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.987 [2024-11-08 04:51:06.922810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:31.987 [2024-11-08 04:51:06.922917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.987 [2024-11-08 04:51:06.922934] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:31.987 #51 NEW cov: 11816 ft: 15363 corp: 43/948b lim: 35 exec/s: 51 rss: 70Mb L: 33/33 MS: 1 CopyPart- 00:07:31.987 [2024-11-08 04:51:06.962663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:4 nsid:0 cdw10:0a370096 cdw11:37003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.987 [2024-11-08 04:51:06.962691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:31.987 [2024-11-08 04:51:06.962799] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: IDENTIFY (06) qid:0 cid:6 nsid:0 cdw10:37370037 cdw11:2d003737 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:31.987 [2024-11-08 04:51:06.962814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:31.987 #52 NEW cov: 11816 ft: 15380 corp: 44/971b lim: 35 exec/s: 26 rss: 70Mb L: 23/33 MS: 1 InsertByte- 00:07:31.987 #52 DONE cov: 11816 ft: 15380 corp: 44/971b lim: 35 exec/s: 26 rss: 70Mb 00:07:31.987 ###### Recommended dictionary. ###### 00:07:31.987 "\001\000\000\000" # Uses: 2 00:07:31.987 ###### End of recommended dictionary. ###### 00:07:31.987 Done 52 runs in 2 second(s) 00:07:32.245 04:51:07 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_2.conf 00:07:32.245 04:51:07 -- ../common.sh@72 -- # (( i++ )) 00:07:32.245 04:51:07 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:32.245 04:51:07 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:07:32.246 04:51:07 -- nvmf/run.sh@23 -- # local fuzzer_type=3 00:07:32.246 04:51:07 -- nvmf/run.sh@24 -- # local timen=1 00:07:32.246 04:51:07 -- nvmf/run.sh@25 -- # local core=0x1 00:07:32.246 04:51:07 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:32.246 04:51:07 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_3.conf 00:07:32.246 04:51:07 -- nvmf/run.sh@29 -- # printf %02d 3 00:07:32.246 04:51:07 -- nvmf/run.sh@29 -- # port=4403 00:07:32.246 04:51:07 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:32.246 04:51:07 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' 00:07:32.246 04:51:07 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4403"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:32.246 04:51:07 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4403' -c /tmp/fuzz_json_3.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 -Z 3 -r /var/tmp/spdk3.sock 00:07:32.246 [2024-11-08 04:51:07.156845] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
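The xtrace above captures the per-instance setup that nvmf/run.sh performs before each fuzzer target: derive a listen port of the form 44NN from the fuzzer number, rewrite trsvcid in the shared JSON config, and launch llvm_nvme_fuzz against that transport ID. A minimal sketch of the pattern, assuming placeholder paths and an output redirection the xtrace does not show (SPDK_DIR stands in for the workspace path; it is not a variable the real script uses):

  fuzzer_type=3
  port=$(printf '44%02d' "$fuzzer_type")   # printf %02d 3 -> 03, so port=4403
  nvmf_cfg=/tmp/fuzz_json_${fuzzer_type}.conf
  # rewrite the default trsvcid 4420 to this instance's port
  sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
      "$SPDK_DIR/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"
  trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

The fuzzer binary then takes the rewritten config via -c, the transport ID via -F, and the per-type corpus directory via -D, exactly as in the command line logged above.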
00:07:32.246 [2024-11-08 04:51:07.156910] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3680145 ] 00:07:32.246 EAL: No free 2048 kB hugepages reported on node 1 00:07:32.246 [2024-11-08 04:51:07.330654] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.504 [2024-11-08 04:51:07.394381] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:32.504 [2024-11-08 04:51:07.394506] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.504 [2024-11-08 04:51:07.452495] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:32.504 [2024-11-08 04:51:07.468828] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4403 *** 00:07:32.504 INFO: Running with entropic power schedule (0xFF, 100). 00:07:32.504 INFO: Seed: 3762786197 00:07:32.504 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:32.504 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:32.504 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_3 00:07:32.504 INFO: A corpus is not provided, starting from an empty corpus 00:07:32.504 #2 INITED exec/s: 0 rss: 60Mb 00:07:32.504 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:32.504 This may also happen if the target rejected all inputs we tried so far 00:07:32.504 [2024-11-08 04:51:07.518111] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:32.504 [2024-11-08 04:51:07.518144] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:32.763 NEW_FUNC[1/678]: 0x43f858 in fuzz_admin_abort_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:114 00:07:32.763 NEW_FUNC[2/678]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:32.763 #3 NEW cov: 11790 ft: 11810 corp: 2/12b lim: 20 exec/s: 0 rss: 68Mb L: 11/11 MS: 1 InsertRepeatedBytes- 00:07:32.763 NEW_FUNC[1/1]: 0x1ca9de8 in thread_execute_poller /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:934 00:07:32.763 #4 NEW cov: 11945 ft: 12583 corp: 3/29b lim: 20 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 InsertRepeatedBytes- 00:07:32.763 [2024-11-08 04:51:07.859084] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:32.763 [2024-11-08 04:51:07.859119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.021 #5 NEW cov: 11958 ft: 13078 corp: 4/41b lim: 20 exec/s: 0 rss: 68Mb L: 12/17 MS: 1 InsertByte- 00:07:33.021 #6 NEW cov: 12043 ft: 13366 corp: 5/53b lim: 20 exec/s: 0 rss: 68Mb L: 12/17 MS: 1 CMP- DE: "\377\377~I$\016p\321"- 00:07:33.021 #7 NEW cov: 12043 ft: 13483 corp: 6/70b lim: 20 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 ChangeByte- 00:07:33.021 #8 NEW cov: 12043 ft: 13603 corp: 7/87b lim: 20 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 ChangeBit- 00:07:33.021 #9 NEW cov: 12043 ft: 13689 corp: 8/104b lim: 20 exec/s: 
0 rss: 68Mb L: 17/17 MS: 1 ShuffleBytes- 00:07:33.021 [2024-11-08 04:51:08.069867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.021 [2024-11-08 04:51:08.069895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.021 #10 NEW cov: 12043 ft: 13829 corp: 9/124b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:33.279 #11 NEW cov: 12043 ft: 13898 corp: 10/141b lim: 20 exec/s: 0 rss: 68Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:33.279 [2024-11-08 04:51:08.150069] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.279 [2024-11-08 04:51:08.150094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.279 #12 NEW cov: 12043 ft: 13921 corp: 11/161b lim: 20 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 InsertRepeatedBytes- 00:07:33.279 #13 NEW cov: 12043 ft: 13995 corp: 12/178b lim: 20 exec/s: 0 rss: 68Mb L: 17/20 MS: 1 ChangeByte- 00:07:33.279 #14 NEW cov: 12043 ft: 14032 corp: 13/195b lim: 20 exec/s: 0 rss: 68Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:33.279 #15 NEW cov: 12043 ft: 14058 corp: 14/212b lim: 20 exec/s: 0 rss: 69Mb L: 17/20 MS: 1 CrossOver- 00:07:33.279 #18 NEW cov: 12043 ft: 14165 corp: 15/222b lim: 20 exec/s: 0 rss: 69Mb L: 10/20 MS: 3 CrossOver-ChangeByte-PersAutoDict- DE: "\377\377~I$\016p\321"- 00:07:33.279 #19 NEW cov: 12043 ft: 14229 corp: 16/234b lim: 20 exec/s: 0 rss: 69Mb L: 12/20 MS: 1 ChangeBit- 00:07:33.537 [2024-11-08 04:51:08.390614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.537 [2024-11-08 04:51:08.390643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.537 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:33.537 #20 NEW cov: 12066 ft: 14260 corp: 17/246b lim: 20 exec/s: 0 rss: 69Mb L: 12/20 MS: 1 CopyPart- 00:07:33.537 [2024-11-08 04:51:08.430533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.537 [2024-11-08 04:51:08.430561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.537 #21 NEW cov: 12066 ft: 14283 corp: 18/257b lim: 20 exec/s: 0 rss: 69Mb L: 11/20 MS: 1 ChangeBinInt- 00:07:33.537 [2024-11-08 04:51:08.471037] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.537 [2024-11-08 04:51:08.471064] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.537 #22 NEW cov: 12066 ft: 14312 corp: 19/277b lim: 20 exec/s: 0 rss: 69Mb L: 20/20 MS: 1 CopyPart- 00:07:33.537 #23 NEW cov: 12066 ft: 14372 corp: 20/292b lim: 20 exec/s: 23 rss: 69Mb L: 15/20 MS: 1 EraseBytes- 00:07:33.537 [2024-11-08 04:51:08.561087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.537 [2024-11-08 04:51:08.561115] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.537 #24 NEW cov: 12066 ft: 14474 corp: 21/304b lim: 20 exec/s: 24 rss: 69Mb L: 12/20 MS: 1 InsertByte- 00:07:33.537 #25 NEW cov: 12066 ft: 14533 corp: 22/321b lim: 20 exec/s: 25 rss: 69Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:33.796 #26 NEW cov: 12066 ft: 14576 corp: 23/338b lim: 20 exec/s: 26 rss: 69Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:33.796 #27 NEW cov: 12066 ft: 14579 corp: 24/355b lim: 20 exec/s: 27 rss: 69Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:33.796 [2024-11-08 04:51:08.721549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.796 [2024-11-08 04:51:08.721575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.796 #28 NEW cov: 12066 ft: 14640 corp: 25/372b lim: 20 exec/s: 28 rss: 69Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:33.796 #29 NEW cov: 12066 ft: 14647 corp: 26/391b lim: 20 exec/s: 29 rss: 69Mb L: 19/20 MS: 1 CopyPart- 00:07:33.796 [2024-11-08 04:51:08.801988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:33.796 [2024-11-08 04:51:08.802013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:33.796 #30 NEW cov: 12066 ft: 14657 corp: 27/411b lim: 20 exec/s: 30 rss: 70Mb L: 20/20 MS: 1 CopyPart- 00:07:33.796 #31 NEW cov: 12066 ft: 14685 corp: 28/428b lim: 20 exec/s: 31 rss: 70Mb L: 17/20 MS: 1 ChangeBinInt- 00:07:34.054 #32 NEW cov: 12066 ft: 14697 corp: 29/445b lim: 20 exec/s: 32 rss: 70Mb L: 17/20 MS: 1 ShuffleBytes- 00:07:34.054 [2024-11-08 04:51:08.932151] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:34.054 [2024-11-08 04:51:08.932179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.054 #33 NEW cov: 12066 ft: 14747 corp: 30/459b lim: 20 exec/s: 33 rss: 70Mb L: 14/20 MS: 1 InsertRepeatedBytes- 00:07:34.054 #34 NEW cov: 12066 ft: 14772 corp: 31/469b lim: 20 exec/s: 34 rss: 70Mb L: 10/20 MS: 1 ChangeByte- 00:07:34.054 [2024-11-08 04:51:09.012382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:34.054 [2024-11-08 04:51:09.012408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.054 #35 NEW cov: 12066 ft: 14849 corp: 32/482b lim: 20 exec/s: 35 rss: 70Mb L: 13/20 MS: 1 EraseBytes- 00:07:34.054 [2024-11-08 04:51:09.052745] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:34.054 [2024-11-08 04:51:09.052770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.054 #36 NEW cov: 12066 ft: 14877 corp: 33/502b lim: 20 exec/s: 36 rss: 70Mb L: 20/20 MS: 1 CrossOver- 00:07:34.054 [2024-11-08 04:51:09.092679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:34.054 [2024-11-08 04:51:09.092706] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.054 #37 NEW cov: 12066 ft: 14948 corp: 34/521b lim: 20 exec/s: 37 rss: 70Mb L: 19/20 MS: 1 PersAutoDict- DE: "\377\377~I$\016p\321"- 00:07:34.054 #38 NEW cov: 12066 ft: 14960 corp: 35/539b lim: 20 exec/s: 38 rss: 70Mb L: 18/20 MS: 1 InsertByte- 00:07:34.313 #39 NEW cov: 12066 ft: 14983 corp: 36/556b lim: 20 exec/s: 39 rss: 70Mb L: 17/20 MS: 1 PersAutoDict- DE: "\377\377~I$\016p\321"- 00:07:34.313 [2024-11-08 04:51:09.213085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:34.313 [2024-11-08 04:51:09.213111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:34.313 #40 NEW cov: 12066 ft: 15025 corp: 37/568b lim: 20 exec/s: 40 rss: 70Mb L: 12/20 MS: 1 ChangeBinInt- 00:07:34.313 [2024-11-08 04:51:09.253007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:34.313 [2024-11-08 04:51:09.253034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.313 #41 NEW cov: 12066 ft: 15032 corp: 38/578b lim: 20 exec/s: 41 rss: 70Mb L: 10/20 MS: 1 EraseBytes- 00:07:34.313 #42 NEW cov: 12066 ft: 15044 corp: 39/593b lim: 20 exec/s: 42 rss: 70Mb L: 15/20 MS: 1 CopyPart- 00:07:34.313 #43 NEW cov: 12066 ft: 15060 corp: 40/612b lim: 20 exec/s: 43 rss: 70Mb L: 19/20 MS: 1 CrossOver- 00:07:34.313 #44 NEW cov: 12066 ft: 15072 corp: 41/629b lim: 20 exec/s: 44 rss: 70Mb L: 17/20 MS: 1 CrossOver- 00:07:34.572 #45 NEW cov: 12066 ft: 15080 corp: 42/646b lim: 20 exec/s: 45 rss: 70Mb L: 17/20 MS: 1 ChangeBit- 00:07:34.572 [2024-11-08 04:51:09.453751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:0 nsid:0 cdw10:00000000 cdw11:00000000 00:07:34.572 [2024-11-08 04:51:09.453779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:0 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:34.572 #46 NEW cov: 12066 ft: 15086 corp: 43/658b lim: 20 exec/s: 46 rss: 70Mb L: 12/20 MS: 1 EraseBytes- 00:07:34.572 #47 NEW cov: 12066 ft: 15124 corp: 44/667b lim: 20 exec/s: 23 rss: 70Mb L: 9/20 MS: 1 CrossOver- 00:07:34.572 #47 DONE cov: 12066 ft: 15124 corp: 44/667b lim: 20 exec/s: 23 rss: 70Mb 00:07:34.572 ###### Recommended dictionary. ###### 00:07:34.572 "\377\377~I$\016p\321" # Uses: 3 00:07:34.572 ###### End of recommended dictionary. 
###### 00:07:34.572 Done 47 runs in 2 second(s) 00:07:34.572 04:51:09 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_3.conf 00:07:34.572 04:51:09 -- ../common.sh@72 -- # (( i++ )) 00:07:34.572 04:51:09 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:34.572 04:51:09 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:07:34.572 04:51:09 -- nvmf/run.sh@23 -- # local fuzzer_type=4 00:07:34.572 04:51:09 -- nvmf/run.sh@24 -- # local timen=1 00:07:34.572 04:51:09 -- nvmf/run.sh@25 -- # local core=0x1 00:07:34.572 04:51:09 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:34.572 04:51:09 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_4.conf 00:07:34.572 04:51:09 -- nvmf/run.sh@29 -- # printf %02d 4 00:07:34.572 04:51:09 -- nvmf/run.sh@29 -- # port=4404 00:07:34.572 04:51:09 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:34.572 04:51:09 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' 00:07:34.572 04:51:09 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4404"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:34.572 04:51:09 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4404' -c /tmp/fuzz_json_4.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 -Z 4 -r /var/tmp/spdk4.sock 00:07:34.830 [2024-11-08 04:51:09.685893] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:34.831 [2024-11-08 04:51:09.685974] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3680490 ] 00:07:34.831 EAL: No free 2048 kB hugepages reported on node 1 00:07:34.831 [2024-11-08 04:51:09.863254] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.831 [2024-11-08 04:51:09.927239] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:34.831 [2024-11-08 04:51:09.927364] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:35.088 [2024-11-08 04:51:09.985531] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:35.088 [2024-11-08 04:51:10.002047] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4404 *** 00:07:35.088 INFO: Running with entropic power schedule (0xFF, 100). 00:07:35.088 INFO: Seed: 2001816447 00:07:35.088 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:35.088 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:35.088 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_4 00:07:35.088 INFO: A corpus is not provided, starting from an empty corpus 00:07:35.088 #2 INITED exec/s: 0 rss: 61Mb 00:07:35.088 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:35.088 This may also happen if the target rejected all inputs we tried so far 00:07:35.088 [2024-11-08 04:51:10.047464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.088 [2024-11-08 04:51:10.047497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.347 NEW_FUNC[1/671]: 0x440958 in fuzz_admin_create_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:126 00:07:35.347 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:35.347 #5 NEW cov: 11598 ft: 11602 corp: 2/10b lim: 35 exec/s: 0 rss: 68Mb L: 9/9 MS: 3 ShuffleBytes-ChangeBit-CMP- DE: "\000\000\000\000\000\000\000_"- 00:07:35.347 [2024-11-08 04:51:10.368160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.347 [2024-11-08 04:51:10.368205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.347 [2024-11-08 04:51:10.368271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.347 [2024-11-08 04:51:10.368290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.347 #6 NEW cov: 11714 ft: 12847 corp: 3/25b lim: 35 exec/s: 0 rss: 68Mb L: 15/15 MS: 1 InsertRepeatedBytes- 00:07:35.347 [2024-11-08 04:51:10.417970] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.347 [2024-11-08 04:51:10.417996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.347 #11 NEW cov: 11720 ft: 13011 corp: 4/34b lim: 35 exec/s: 0 rss: 68Mb L: 9/15 MS: 5 ShuffleBytes-ChangeBit-ChangeByte-ChangeBinInt-PersAutoDict- DE: "\000\000\000\000\000\000\000_"- 00:07:35.605 [2024-11-08 04:51:10.458088] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.458114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.605 #12 NEW cov: 11805 ft: 13405 corp: 5/43b lim: 35 exec/s: 0 rss: 68Mb L: 9/15 MS: 1 ShuffleBytes- 00:07:35.605 [2024-11-08 04:51:10.498200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.498229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.605 #13 NEW cov: 11805 ft: 13493 corp: 6/52b lim: 35 exec/s: 0 rss: 68Mb L: 9/15 MS: 1 ShuffleBytes- 00:07:35.605 [2024-11-08 04:51:10.538382] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:35.605 [2024-11-08 04:51:10.538409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.605 #14 NEW cov: 11805 ft: 13632 corp: 7/61b lim: 35 exec/s: 0 rss: 68Mb L: 9/15 MS: 1 ChangeBinInt- 00:07:35.605 [2024-11-08 04:51:10.578633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.578660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.605 [2024-11-08 04:51:10.578726] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:5f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.578740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.605 #15 NEW cov: 11805 ft: 13667 corp: 8/78b lim: 35 exec/s: 0 rss: 68Mb L: 17/17 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000_"- 00:07:35.605 [2024-11-08 04:51:10.618711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.618737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.605 [2024-11-08 04:51:10.618794] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.618809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.605 #16 NEW cov: 11805 ft: 13703 corp: 9/97b lim: 35 exec/s: 0 rss: 69Mb L: 19/19 MS: 1 CMP- DE: "\001\000\000\000"- 00:07:35.605 [2024-11-08 04:51:10.668743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0000 cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.668769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.605 #17 NEW cov: 11805 ft: 13773 corp: 10/106b lim: 35 exec/s: 0 rss: 69Mb L: 9/19 MS: 1 CMP- DE: "\377\377\377Z"- 00:07:35.605 [2024-11-08 04:51:10.709009] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.709035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.605 [2024-11-08 04:51:10.709092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:005f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.605 [2024-11-08 04:51:10.709107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.863 #18 NEW cov: 11805 ft: 13799 corp: 11/120b lim: 35 exec/s: 0 rss: 69Mb L: 14/19 MS: 1 InsertRepeatedBytes- 00:07:35.863 [2024-11-08 04:51:10.749106] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL 
DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.863 [2024-11-08 04:51:10.749132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.863 [2024-11-08 04:51:10.749191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000c500 cdw11:5f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.863 [2024-11-08 04:51:10.749208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.863 #19 NEW cov: 11805 ft: 13920 corp: 12/137b lim: 35 exec/s: 0 rss: 69Mb L: 17/19 MS: 1 ChangeByte- 00:07:35.863 [2024-11-08 04:51:10.789241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00200000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.863 [2024-11-08 04:51:10.789266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.863 [2024-11-08 04:51:10.789322] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:5f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.864 [2024-11-08 04:51:10.789336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.864 #20 NEW cov: 11805 ft: 14014 corp: 13/154b lim: 35 exec/s: 0 rss: 69Mb L: 17/19 MS: 1 ChangeBit- 00:07:35.864 [2024-11-08 04:51:10.829142] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.864 [2024-11-08 04:51:10.829167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.864 #21 NEW cov: 11805 ft: 14029 corp: 14/163b lim: 35 exec/s: 0 rss: 69Mb L: 9/19 MS: 1 ChangeBit- 00:07:35.864 [2024-11-08 04:51:10.869417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.864 [2024-11-08 04:51:10.869442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.864 [2024-11-08 04:51:10.869500] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00005f0b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.864 [2024-11-08 04:51:10.869514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.864 #22 NEW cov: 11805 ft: 14044 corp: 15/180b lim: 35 exec/s: 0 rss: 69Mb L: 17/19 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000_"- 00:07:35.864 [2024-11-08 04:51:10.909605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.864 [2024-11-08 04:51:10.909630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.864 [2024-11-08 04:51:10.909685] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00005f0b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.864 [2024-11-08 04:51:10.909699] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:35.864 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:35.864 #23 NEW cov: 11828 ft: 14089 corp: 16/197b lim: 35 exec/s: 0 rss: 69Mb L: 17/19 MS: 1 CopyPart- 00:07:35.864 [2024-11-08 04:51:10.949573] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:35.864 [2024-11-08 04:51:10.949598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:35.864 #24 NEW cov: 11828 ft: 14140 corp: 17/206b lim: 35 exec/s: 0 rss: 69Mb L: 9/19 MS: 1 CrossOver- 00:07:36.122 [2024-11-08 04:51:10.989677] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:10.989702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.122 #25 NEW cov: 11828 ft: 14147 corp: 18/216b lim: 35 exec/s: 0 rss: 69Mb L: 10/19 MS: 1 InsertByte- 00:07:36.122 [2024-11-08 04:51:11.019924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:11.019949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.122 [2024-11-08 04:51:11.020004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:ffffc5ff cdw11:5a000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:11.020019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.122 #26 NEW cov: 11828 ft: 14165 corp: 19/233b lim: 35 exec/s: 26 rss: 69Mb L: 17/19 MS: 1 PersAutoDict- DE: "\377\377\377Z"- 00:07:36.122 [2024-11-08 04:51:11.059908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:11.059933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.122 #27 NEW cov: 11828 ft: 14184 corp: 20/242b lim: 35 exec/s: 27 rss: 69Mb L: 9/19 MS: 1 ChangeBinInt- 00:07:36.122 [2024-11-08 04:51:11.100173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00200000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:11.100199] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.122 [2024-11-08 04:51:11.100256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:5f800000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:11.100269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.122 #28 NEW cov: 11828 ft: 14268 corp: 21/259b lim: 35 exec/s: 28 rss: 69Mb L: 17/19 MS: 1 ChangeBit- 00:07:36.122 [2024-11-08 
04:51:11.140297] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:d2000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:11.140322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.122 [2024-11-08 04:51:11.140377] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:005f0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:11.140392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.122 #29 NEW cov: 11828 ft: 14286 corp: 22/273b lim: 35 exec/s: 29 rss: 69Mb L: 14/19 MS: 1 ChangeByte- 00:07:36.122 [2024-11-08 04:51:11.180423] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00200011 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.122 [2024-11-08 04:51:11.180448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.122 [2024-11-08 04:51:11.180502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:5f800000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.123 [2024-11-08 04:51:11.180516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.123 #30 NEW cov: 11828 ft: 14304 corp: 23/290b lim: 35 exec/s: 30 rss: 69Mb L: 17/19 MS: 1 ChangeBinInt- 00:07:36.123 [2024-11-08 04:51:11.220708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00a90001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.123 [2024-11-08 04:51:11.220734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.123 [2024-11-08 04:51:11.220789] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a90001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.123 [2024-11-08 04:51:11.220806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.123 [2024-11-08 04:51:11.220863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a90001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.123 [2024-11-08 04:51:11.220877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.381 #31 NEW cov: 11828 ft: 14591 corp: 24/316b lim: 35 exec/s: 31 rss: 70Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:36.381 [2024-11-08 04:51:11.260662] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00200011 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.381 [2024-11-08 04:51:11.260687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.381 [2024-11-08 04:51:11.260744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:5f840000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.381 [2024-11-08 04:51:11.260758] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.381 #32 NEW cov: 11828 ft: 14592 corp: 25/333b lim: 35 exec/s: 32 rss: 70Mb L: 17/26 MS: 1 ChangeBit- 00:07:36.381 [2024-11-08 04:51:11.300619] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0000 cdw11:ffff0002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.381 [2024-11-08 04:51:11.300645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.381 #33 NEW cov: 11828 ft: 14614 corp: 26/342b lim: 35 exec/s: 33 rss: 70Mb L: 9/26 MS: 1 ChangeBinInt- 00:07:36.381 [2024-11-08 04:51:11.340722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:fffff8ff cdw11:ffff0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.381 [2024-11-08 04:51:11.340748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.381 #34 NEW cov: 11828 ft: 14631 corp: 27/351b lim: 35 exec/s: 34 rss: 70Mb L: 9/26 MS: 1 ChangeBinInt- 00:07:36.381 [2024-11-08 04:51:11.380833] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.381 [2024-11-08 04:51:11.380860] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.381 #35 NEW cov: 11828 ft: 14636 corp: 28/360b lim: 35 exec/s: 35 rss: 70Mb L: 9/26 MS: 1 ChangeBit- 00:07:36.381 [2024-11-08 04:51:11.410885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00010000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.381 [2024-11-08 04:51:11.410911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.381 #36 NEW cov: 11828 ft: 14649 corp: 29/370b lim: 35 exec/s: 36 rss: 70Mb L: 10/26 MS: 1 ChangeBit- 00:07:36.381 [2024-11-08 04:51:11.451027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0000 cdw11:ff620003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.381 [2024-11-08 04:51:11.451053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.381 #37 NEW cov: 11828 ft: 14654 corp: 30/380b lim: 35 exec/s: 37 rss: 70Mb L: 10/26 MS: 1 InsertByte- 00:07:36.640 [2024-11-08 04:51:11.491122] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00ff0000 cdw11:ff490002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.491149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.640 #38 NEW cov: 11828 ft: 14690 corp: 31/389b lim: 35 exec/s: 38 rss: 70Mb L: 9/26 MS: 1 ChangeByte- 00:07:36.640 [2024-11-08 04:51:11.531238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.531264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.640 #39 NEW cov: 11828 ft: 14692 corp: 32/398b lim: 35 exec/s: 39 
rss: 70Mb L: 9/26 MS: 1 ChangeBinInt- 00:07:36.640 [2024-11-08 04:51:11.561350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:20000003 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.561376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.640 #40 NEW cov: 11828 ft: 14701 corp: 33/407b lim: 35 exec/s: 40 rss: 70Mb L: 9/26 MS: 1 ChangeBit- 00:07:36.640 [2024-11-08 04:51:11.601613] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00200000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.601639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.640 [2024-11-08 04:51:11.601695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:5f800000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.601709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.640 #41 NEW cov: 11828 ft: 14708 corp: 34/424b lim: 35 exec/s: 41 rss: 70Mb L: 17/26 MS: 1 CopyPart- 00:07:36.640 [2024-11-08 04:51:11.641586] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:5f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.641611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.640 #42 NEW cov: 11828 ft: 14713 corp: 35/436b lim: 35 exec/s: 42 rss: 70Mb L: 12/26 MS: 1 CopyPart- 00:07:36.640 [2024-11-08 04:51:11.681690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.681717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.640 #43 NEW cov: 11828 ft: 14722 corp: 36/446b lim: 35 exec/s: 43 rss: 70Mb L: 10/26 MS: 1 ChangeBit- 00:07:36.640 [2024-11-08 04:51:11.721964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:40000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.721990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.640 [2024-11-08 04:51:11.722047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000c500 cdw11:5f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.640 [2024-11-08 04:51:11.722061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.640 #44 NEW cov: 11828 ft: 14756 corp: 37/463b lim: 35 exec/s: 44 rss: 70Mb L: 17/26 MS: 1 ChangeBit- 00:07:36.899 [2024-11-08 04:51:11.761942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00590a00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.761970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:36.899 #45 NEW cov: 11828 ft: 14775 corp: 38/473b lim: 35 exec/s: 45 rss: 70Mb L: 10/26 MS: 1 InsertByte- 00:07:36.899 [2024-11-08 04:51:11.802614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00a90001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.802643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.899 [2024-11-08 04:51:11.802700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:a9a9a9a9 cdw11:a9a90001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.802715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.899 [2024-11-08 04:51:11.802768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:a9a9a9a9 cdw11:a9a90001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.802783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.899 [2024-11-08 04:51:11.802838] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:7 nsid:0 cdw10:0000a900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.802851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:36.899 #46 NEW cov: 11828 ft: 15162 corp: 39/503b lim: 35 exec/s: 46 rss: 70Mb L: 30/30 MS: 1 CrossOver- 00:07:36.899 [2024-11-08 04:51:11.852526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.852553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.899 [2024-11-08 04:51:11.852612] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:0000c500 cdw11:5f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.852626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.899 [2024-11-08 04:51:11.852681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:6 nsid:0 cdw10:61616161 cdw11:61000002 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.852695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:36.899 #47 NEW cov: 11828 ft: 15176 corp: 40/525b lim: 35 exec/s: 47 rss: 70Mb L: 22/30 MS: 1 InsertRepeatedBytes- 00:07:36.899 [2024-11-08 04:51:11.892307] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.892333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.899 #48 NEW cov: 11828 ft: 15183 corp: 41/534b lim: 35 exec/s: 48 rss: 70Mb L: 9/30 MS: 1 ShuffleBytes- 00:07:36.899 [2024-11-08 04:51:11.922419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 
nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.922445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.899 #49 NEW cov: 11828 ft: 15233 corp: 42/545b lim: 35 exec/s: 49 rss: 70Mb L: 11/30 MS: 1 InsertByte- 00:07:36.899 [2024-11-08 04:51:11.962647] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.962673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.899 [2024-11-08 04:51:11.962730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:5f000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:11.962744] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:36.899 #50 NEW cov: 11828 ft: 15254 corp: 43/562b lim: 35 exec/s: 50 rss: 70Mb L: 17/30 MS: 1 ShuffleBytes- 00:07:36.899 [2024-11-08 04:51:12.002773] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00200011 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:12.002799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:36.899 [2024-11-08 04:51:12.002854] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:5f000000 cdw11:00800000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:36.899 [2024-11-08 04:51:12.002867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.158 #51 NEW cov: 11828 ft: 15300 corp: 44/579b lim: 35 exec/s: 51 rss: 70Mb L: 17/30 MS: 1 ShuffleBytes- 00:07:37.158 [2024-11-08 04:51:12.042876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.158 [2024-11-08 04:51:12.042901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.158 [2024-11-08 04:51:12.042957] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO CQ (05) qid:0 cid:5 nsid:0 cdw10:00005f00 cdw11:0b000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.158 [2024-11-08 04:51:12.042971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.158 #52 NEW cov: 11828 ft: 15312 corp: 45/596b lim: 35 exec/s: 26 rss: 70Mb L: 17/30 MS: 1 ShuffleBytes- 00:07:37.158 #52 DONE cov: 11828 ft: 15312 corp: 45/596b lim: 35 exec/s: 26 rss: 70Mb 00:07:37.158 ###### Recommended dictionary. ###### 00:07:37.158 "\000\000\000\000\000\000\000_" # Uses: 3 00:07:37.158 "\001\000\000\000" # Uses: 0 00:07:37.158 "\377\377\377Z" # Uses: 1 00:07:37.158 ###### End of recommended dictionary. 
###### 00:07:37.158 Done 52 runs in 2 second(s) 00:07:37.158 04:51:12 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_4.conf 00:07:37.158 04:51:12 -- ../common.sh@72 -- # (( i++ )) 00:07:37.158 04:51:12 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:37.158 04:51:12 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:07:37.158 04:51:12 -- nvmf/run.sh@23 -- # local fuzzer_type=5 00:07:37.158 04:51:12 -- nvmf/run.sh@24 -- # local timen=1 00:07:37.158 04:51:12 -- nvmf/run.sh@25 -- # local core=0x1 00:07:37.158 04:51:12 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:37.158 04:51:12 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_5.conf 00:07:37.158 04:51:12 -- nvmf/run.sh@29 -- # printf %02d 5 00:07:37.158 04:51:12 -- nvmf/run.sh@29 -- # port=4405 00:07:37.158 04:51:12 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:37.158 04:51:12 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' 00:07:37.158 04:51:12 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4405"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:37.158 04:51:12 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4405' -c /tmp/fuzz_json_5.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 -Z 5 -r /var/tmp/spdk5.sock 00:07:37.158 [2024-11-08 04:51:12.238133] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:37.158 [2024-11-08 04:51:12.238220] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3681033 ] 00:07:37.417 EAL: No free 2048 kB hugepages reported on node 1 00:07:37.417 [2024-11-08 04:51:12.414318] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.417 [2024-11-08 04:51:12.481050] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:37.417 [2024-11-08 04:51:12.481174] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:37.675 [2024-11-08 04:51:12.539206] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:37.675 [2024-11-08 04:51:12.555537] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4405 *** 00:07:37.675 INFO: Running with entropic power schedule (0xFF, 100). 00:07:37.675 INFO: Seed: 258831781 00:07:37.675 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:37.675 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:37.675 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_5 00:07:37.675 INFO: A corpus is not provided, starting from an empty corpus 00:07:37.675 #2 INITED exec/s: 0 rss: 60Mb 00:07:37.675 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
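Each target's run closes with a final "#N DONE cov: ..." line, a recommended-dictionary block, and a "Done N runs in T second(s)" summary before run.sh removes the temporary config and advances to the next fuzzer type. A hedged helper for skimming those summaries out of a saved console log (console.log is an assumed file name, not an artifact this job is known to produce):

  # final coverage/corpus stats, one line per fuzzer target
  grep 'DONE cov:' console.log
  # wall-clock summary per target, e.g. "Done 52 runs in 2 second(s)"
  grep 'runs in' console.log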
00:07:37.675 This may also happen if the target rejected all inputs we tried so far 00:07:37.675 [2024-11-08 04:51:12.624646] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.675 [2024-11-08 04:51:12.624682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.933 NEW_FUNC[1/670]: 0x442af8 in fuzz_admin_create_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:142 00:07:37.933 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:37.933 #6 NEW cov: 11604 ft: 11613 corp: 2/10b lim: 45 exec/s: 0 rss: 68Mb L: 9/9 MS: 4 ChangeByte-CrossOver-ShuffleBytes-CMP- DE: "\014\000\000\000\000\000\000\000"- 00:07:37.933 [2024-11-08 04:51:12.945171] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.933 [2024-11-08 04:51:12.945210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.933 NEW_FUNC[1/1]: 0x1ca82d8 in msg_queue_run_batch /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:810 00:07:37.933 #7 NEW cov: 11725 ft: 12290 corp: 3/27b lim: 45 exec/s: 0 rss: 69Mb L: 17/17 MS: 1 CopyPart- 00:07:37.933 [2024-11-08 04:51:12.996016] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000c800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.933 [2024-11-08 04:51:12.996047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:37.933 [2024-11-08 04:51:12.996170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.933 [2024-11-08 04:51:12.996188] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:37.933 [2024-11-08 04:51:12.996301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.933 [2024-11-08 04:51:12.996318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:37.933 #12 NEW cov: 11731 ft: 13251 corp: 4/55b lim: 45 exec/s: 0 rss: 69Mb L: 28/28 MS: 5 CopyPart-InsertByte-ChangeBinInt-EraseBytes-InsertRepeatedBytes- 00:07:37.933 [2024-11-08 04:51:13.035644] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:37.933 [2024-11-08 04:51:13.035670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.191 #13 NEW cov: 11816 ft: 13539 corp: 5/64b lim: 45 exec/s: 0 rss: 69Mb L: 9/28 MS: 1 CopyPart- 00:07:38.191 [2024-11-08 04:51:13.075964] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:99999999 cdw11:99990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.191 
[2024-11-08 04:51:13.075991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.191 [2024-11-08 04:51:13.076117] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:99990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.191 [2024-11-08 04:51:13.076134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.191 #14 NEW cov: 11816 ft: 13836 corp: 6/84b lim: 45 exec/s: 0 rss: 69Mb L: 20/28 MS: 1 InsertRepeatedBytes- 00:07:38.191 [2024-11-08 04:51:13.115857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.191 [2024-11-08 04:51:13.115884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.191 #15 NEW cov: 11816 ft: 13899 corp: 7/101b lim: 45 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:07:38.192 [2024-11-08 04:51:13.166057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:000a0c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.192 [2024-11-08 04:51:13.166083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.192 #16 NEW cov: 11816 ft: 14004 corp: 8/110b lim: 45 exec/s: 0 rss: 69Mb L: 9/28 MS: 1 CrossOver- 00:07:38.192 [2024-11-08 04:51:13.206217] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.192 [2024-11-08 04:51:13.206244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.192 #17 NEW cov: 11816 ft: 14058 corp: 9/127b lim: 45 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 ShuffleBytes- 00:07:38.192 [2024-11-08 04:51:13.256331] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.192 [2024-11-08 04:51:13.256359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.192 #18 NEW cov: 11816 ft: 14139 corp: 10/144b lim: 45 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:07:38.192 [2024-11-08 04:51:13.296687] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24999999 cdw11:99990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.192 [2024-11-08 04:51:13.296714] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.192 [2024-11-08 04:51:13.296825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:99990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.192 [2024-11-08 04:51:13.296841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.450 #19 NEW cov: 11816 ft: 14196 corp: 11/164b lim: 45 exec/s: 0 rss: 69Mb L: 20/28 MS: 1 ChangeByte- 00:07:38.450 [2024-11-08 04:51:13.346644] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:0c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.450 [2024-11-08 04:51:13.346671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.450 #20 NEW cov: 11816 ft: 14221 corp: 12/181b lim: 45 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:07:38.450 [2024-11-08 04:51:13.386695] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.450 [2024-11-08 04:51:13.386721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.450 #21 NEW cov: 11816 ft: 14237 corp: 13/198b lim: 45 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 ShuffleBytes- 00:07:38.450 [2024-11-08 04:51:13.426826] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000d00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.450 [2024-11-08 04:51:13.426852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.450 #22 NEW cov: 11816 ft: 14256 corp: 14/207b lim: 45 exec/s: 0 rss: 69Mb L: 9/28 MS: 1 ChangeBit- 00:07:38.450 [2024-11-08 04:51:13.466858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:0c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.450 [2024-11-08 04:51:13.466884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.450 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:38.450 #23 NEW cov: 11839 ft: 14285 corp: 15/224b lim: 45 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 ShuffleBytes- 00:07:38.450 [2024-11-08 04:51:13.507350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24999999 cdw11:99990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.450 [2024-11-08 04:51:13.507376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.450 [2024-11-08 04:51:13.507487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:99990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.450 [2024-11-08 04:51:13.507504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.450 #24 NEW cov: 11839 ft: 14293 corp: 16/244b lim: 45 exec/s: 0 rss: 70Mb L: 20/28 MS: 1 ShuffleBytes- 00:07:38.450 [2024-11-08 04:51:13.547146] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.450 [2024-11-08 04:51:13.547172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.708 #25 NEW cov: 11839 ft: 14349 corp: 17/261b lim: 45 exec/s: 0 rss: 70Mb L: 17/28 MS: 1 CrossOver- 00:07:38.708 [2024-11-08 04:51:13.587244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:38.708 [2024-11-08 04:51:13.587272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.708 [2024-11-08 04:51:13.627469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 [2024-11-08 04:51:13.627495] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.709 #27 NEW cov: 11839 ft: 14362 corp: 18/274b lim: 45 exec/s: 27 rss: 70Mb L: 13/28 MS: 2 PersAutoDict-EraseBytes- DE: "\014\000\000\000\000\000\000\000"- 00:07:38.709 [2024-11-08 04:51:13.667496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 [2024-11-08 04:51:13.667525] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.709 #33 NEW cov: 11839 ft: 14378 corp: 19/291b lim: 45 exec/s: 33 rss: 70Mb L: 17/28 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:07:38.709 [2024-11-08 04:51:13.698476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24999999 cdw11:99990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 [2024-11-08 04:51:13.698500] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.709 [2024-11-08 04:51:13.698608] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:99999999 cdw11:99990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 [2024-11-08 04:51:13.698626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.709 [2024-11-08 04:51:13.698734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 [2024-11-08 04:51:13.698748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.709 [2024-11-08 04:51:13.698855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 [2024-11-08 04:51:13.698869] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.709 #34 NEW cov: 11839 ft: 14722 corp: 20/327b lim: 45 exec/s: 34 rss: 70Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:38.709 [2024-11-08 04:51:13.747780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:0c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 [2024-11-08 04:51:13.747804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.709 #35 NEW cov: 11839 ft: 14734 corp: 21/344b lim: 45 exec/s: 35 rss: 70Mb L: 17/36 MS: 1 ChangeByte- 00:07:38.709 [2024-11-08 04:51:13.788137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 
[2024-11-08 04:51:13.788161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.709 [2024-11-08 04:51:13.788273] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.709 [2024-11-08 04:51:13.788289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.709 #36 NEW cov: 11839 ft: 14735 corp: 22/369b lim: 45 exec/s: 36 rss: 70Mb L: 25/36 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:07:38.967 [2024-11-08 04:51:13.827997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:0a0c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:13.828023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.967 #39 NEW cov: 11839 ft: 14766 corp: 23/384b lim: 45 exec/s: 39 rss: 70Mb L: 15/36 MS: 3 EraseBytes-CopyPart-InsertRepeatedBytes- 00:07:38.967 [2024-11-08 04:51:13.868352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:13.868379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.967 [2024-11-08 04:51:13.868491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:35000000 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:13.868505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.967 #40 NEW cov: 11839 ft: 14767 corp: 24/409b lim: 45 exec/s: 40 rss: 70Mb L: 25/36 MS: 1 ChangeByte- 00:07:38.967 [2024-11-08 04:51:13.908321] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:13.908346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.967 #41 NEW cov: 11839 ft: 14784 corp: 25/426b lim: 45 exec/s: 41 rss: 70Mb L: 17/36 MS: 1 ShuffleBytes- 00:07:38.967 [2024-11-08 04:51:13.948683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:13.948711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.967 [2024-11-08 04:51:13.948823] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:13.948838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.967 #42 NEW cov: 11839 ft: 14789 corp: 26/444b lim: 45 exec/s: 42 rss: 70Mb L: 18/36 MS: 1 InsertByte- 00:07:38.967 [2024-11-08 04:51:13.988832] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24999999 cdw11:99990004 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:13.988857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.967 [2024-11-08 04:51:13.988965] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffff0004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:13.988980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.967 [2024-11-08 04:51:14.029515] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:24999999 cdw11:990c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:14.029545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:38.967 [2024-11-08 04:51:14.029681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:14.029696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:38.967 [2024-11-08 04:51:14.029812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:000a0000 cdw11:99990007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:14.029828] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:38.967 [2024-11-08 04:51:14.029946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff990004 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:14.029961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:38.967 #44 NEW cov: 11839 ft: 14799 corp: 27/481b lim: 45 exec/s: 44 rss: 70Mb L: 37/37 MS: 2 CMP-CrossOver- DE: "\377\377\377\377\377\377\377\377"- 00:07:38.967 [2024-11-08 04:51:14.068772] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:00650000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:38.967 [2024-11-08 04:51:14.068799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.226 #45 NEW cov: 11839 ft: 14824 corp: 28/491b lim: 45 exec/s: 45 rss: 70Mb L: 10/37 MS: 1 InsertByte- 00:07:39.226 [2024-11-08 04:51:14.108951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.108976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.226 #46 NEW cov: 11839 ft: 14857 corp: 29/505b lim: 45 exec/s: 46 rss: 70Mb L: 14/37 MS: 1 CopyPart- 00:07:39.226 [2024-11-08 04:51:14.148998] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.149025] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.226 #47 NEW cov: 
11839 ft: 14867 corp: 30/518b lim: 45 exec/s: 47 rss: 70Mb L: 13/37 MS: 1 ChangeByte- 00:07:39.226 [2024-11-08 04:51:14.189976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.190003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.226 [2024-11-08 04:51:14.190129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:35000000 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.190146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.226 [2024-11-08 04:51:14.190251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffff0000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.190266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.226 [2024-11-08 04:51:14.190379] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.190394] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:39.226 #48 NEW cov: 11839 ft: 14886 corp: 31/561b lim: 45 exec/s: 48 rss: 70Mb L: 43/43 MS: 1 InsertRepeatedBytes- 00:07:39.226 [2024-11-08 04:51:14.239575] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.239603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.226 [2024-11-08 04:51:14.239713] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:35000000 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.239729] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.226 #49 NEW cov: 11839 ft: 14894 corp: 32/586b lim: 45 exec/s: 49 rss: 70Mb L: 25/43 MS: 1 PersAutoDict- DE: "\377\377\377\377\377\377\377\377"- 00:07:39.226 [2024-11-08 04:51:14.279440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.279466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.226 #50 NEW cov: 11839 ft: 14909 corp: 33/603b lim: 45 exec/s: 50 rss: 70Mb L: 17/43 MS: 1 ChangeBinInt- 00:07:39.226 [2024-11-08 04:51:14.319562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.226 [2024-11-08 04:51:14.319588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.485 #51 NEW cov: 11839 ft: 14923 corp: 34/617b lim: 45 exec/s: 51 rss: 70Mb L: 14/43 MS: 1 InsertByte- 
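A note on reading these traces: each *NOTICE* pair is SPDK's qpair tracer printing the fuzzed admin command (nvme_qpair.c:225, nvme_admin_qpair_print_command) followed by the target's error completion (nvme_qpair.c:477, spdk_nvme_print_completion). For CREATE IO SQ (opcode 01h) the NVMe base specification lays out CDW10 as QSIZE[31:16]/QID[15:0] and CDW11 as CQID[31:16]/QPRIO[2:1]/PC[0], so the cdw10/cdw11 values in the log decode directly. A minimal standalone C sketch of that decoding (illustrative only, not part of the SPDK tree):

#include <stdint.h>
#include <stdio.h>

/* Decode CDW10/CDW11 of a Create I/O Submission Queue admin command
 * (opcode 01h); field layout per the NVMe base specification. */
static void decode_create_io_sq(uint32_t cdw10, uint32_t cdw11)
{
        uint16_t qid   = cdw10 & 0xffff;     /* submission queue identifier */
        uint16_t qsize = cdw10 >> 16;        /* queue size (0's based) */
        uint8_t  pc    = cdw11 & 0x1;        /* physically contiguous */
        uint8_t  qprio = (cdw11 >> 1) & 0x3; /* queue priority */
        uint16_t cqid  = cdw11 >> 16;        /* paired completion queue id */

        printf("qid=0x%04x qsize=%u pc=%u qprio=%u cqid=0x%04x\n",
               qid, qsize, pc, qprio, cqid);
}

int main(void)
{
        /* cdw10:00000c00 cdw11:00000000 -- the first rejected command above */
        decode_create_io_sq(0x00000c00, 0x00000000);
        return 0;
}

Run against the first trace in this pass, this prints qid=0x0c00 with a 0's-based queue size of 0 -- exactly the kind of degenerate queue-creation request the fuzzer keeps mutating toward.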
00:07:39.485 [2024-11-08 04:51:14.359899] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.359926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.485 [2024-11-08 04:51:14.360039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:ffff0000 cdw11:ffff0007 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.360055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.485 #52 NEW cov: 11839 ft: 14936 corp: 35/641b lim: 45 exec/s: 52 rss: 70Mb L: 24/43 MS: 1 InsertRepeatedBytes- 00:07:39.485 [2024-11-08 04:51:14.400359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.400386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.485 [2024-11-08 04:51:14.400502] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:35000000 cdw11:000c0000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.400517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.485 [2024-11-08 04:51:14.400637] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:0c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.400653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.485 #53 NEW cov: 11839 ft: 14941 corp: 36/674b lim: 45 exec/s: 53 rss: 70Mb L: 33/43 MS: 1 PersAutoDict- DE: "\014\000\000\000\000\000\000\000"- 00:07:39.485 [2024-11-08 04:51:14.449919] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.449946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.485 #54 NEW cov: 11839 ft: 14951 corp: 37/683b lim: 45 exec/s: 54 rss: 70Mb L: 9/43 MS: 1 ShuffleBytes- 00:07:39.485 [2024-11-08 04:51:14.490664] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:0000c800 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.490691] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.485 [2024-11-08 04:51:14.490800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00000900 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.490816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.485 [2024-11-08 04:51:14.490924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:07:39.485 [2024-11-08 04:51:14.490939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.485 #55 NEW cov: 11839 ft: 14977 corp: 38/711b lim: 45 exec/s: 55 rss: 70Mb L: 28/43 MS: 1 ChangeBinInt- 00:07:39.485 [2024-11-08 04:51:14.540395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c0c cdw11:00000001 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.540422] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.485 [2024-11-08 04:51:14.540543] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:00240000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.540560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.485 #56 NEW cov: 11839 ft: 15031 corp: 39/729b lim: 45 exec/s: 56 rss: 70Mb L: 18/43 MS: 1 ChangeByte- 00:07:39.485 [2024-11-08 04:51:14.590837] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:4 nsid:0 cdw10:00000c00 cdw11:0c000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.485 [2024-11-08 04:51:14.590863] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:39.486 [2024-11-08 04:51:14.590976] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.486 [2024-11-08 04:51:14.590996] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:39.486 [2024-11-08 04:51:14.591101] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: CREATE IO SQ (01) qid:0 cid:6 nsid:0 cdw10:00004100 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:39.486 [2024-11-08 04:51:14.591116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:39.744 #57 NEW cov: 11839 ft: 15048 corp: 40/760b lim: 45 exec/s: 28 rss: 70Mb L: 31/43 MS: 1 CopyPart- 00:07:39.744 #57 DONE cov: 11839 ft: 15048 corp: 40/760b lim: 45 exec/s: 28 rss: 70Mb 00:07:39.744 ###### Recommended dictionary. ###### 00:07:39.744 "\014\000\000\000\000\000\000\000" # Uses: 7 00:07:39.744 "\377\377\377\377\377\377\377\377" # Uses: 1 00:07:39.744 ###### End of recommended dictionary. 
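The recommended-dictionary block above is libFuzzer's own summary: each quoted entry is a byte pattern that repeatedly unlocked new coverage (the PersAutoDict mutations earlier in the run replay exactly these fragments, e.g. "\014\000\000\000\000\000\000\000" with 7 uses). A hypothetical way to carry them into a later local run -- not something this job's scripts do -- is to save the entries in AFL/libFuzzer dictionary syntax and pass the file to any libFuzzer-driven target via -dict=:

# create_io_sq.dict -- hypothetical dictionary distilled from the summary above
cdw_0c="\x0c\x00\x00\x00\x00\x00\x00\x00"
all_ff="\xff\xff\xff\xff\xff\xff\xff\xff"

The keyword name before '=' is optional in this format, and the quoted value accepts \xNN escapes, so the octal "\014..." entry from the log becomes \x0c here.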
###### 00:07:39.744 Done 57 runs in 2 second(s) 00:07:39.744 04:51:14 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_5.conf 00:07:39.744 04:51:14 -- ../common.sh@72 -- # (( i++ )) 00:07:39.745 04:51:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:39.745 04:51:14 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1 00:07:39.745 04:51:14 -- nvmf/run.sh@23 -- # local fuzzer_type=6 00:07:39.745 04:51:14 -- nvmf/run.sh@24 -- # local timen=1 00:07:39.745 04:51:14 -- nvmf/run.sh@25 -- # local core=0x1 00:07:39.745 04:51:14 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:39.745 04:51:14 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_6.conf 00:07:39.745 04:51:14 -- nvmf/run.sh@29 -- # printf %02d 6 00:07:39.745 04:51:14 -- nvmf/run.sh@29 -- # port=4406 00:07:39.745 04:51:14 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:39.745 04:51:14 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' 00:07:39.745 04:51:14 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4406"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:39.745 04:51:14 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4406' -c /tmp/fuzz_json_6.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 -Z 6 -r /var/tmp/spdk6.sock 00:07:39.745 [2024-11-08 04:51:14.787641] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:39.745 [2024-11-08 04:51:14.787709] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3681465 ] 00:07:39.745 EAL: No free 2048 kB hugepages reported on node 1 00:07:40.003 [2024-11-08 04:51:14.966390] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.003 [2024-11-08 04:51:15.029578] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:40.003 [2024-11-08 04:51:15.029703] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:40.003 [2024-11-08 04:51:15.087700] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:40.003 [2024-11-08 04:51:15.104027] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4406 *** 00:07:40.261 INFO: Running with entropic power schedule (0xFF, 100). 00:07:40.261 INFO: Seed: 2807840495 00:07:40.261 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:40.261 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:40.261 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_6 00:07:40.261 INFO: A corpus is not provided, starting from an empty corpus 00:07:40.261 #2 INITED exec/s: 0 rss: 61Mb 00:07:40.261 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:40.261 This may also happen if the target rejected all inputs we tried so far 00:07:40.261 [2024-11-08 04:51:15.162926] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d0b cdw11:00000000 00:07:40.261 [2024-11-08 04:51:15.162961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.519 NEW_FUNC[1/669]: 0x445308 in fuzz_admin_delete_io_completion_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:161 00:07:40.519 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:40.519 #7 NEW cov: 11528 ft: 11530 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 5 ShuffleBytes-ShuffleBytes-ChangeBit-ShuffleBytes-InsertByte- 00:07:40.519 [2024-11-08 04:51:15.483876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009db5 cdw11:00000000 00:07:40.519 [2024-11-08 04:51:15.483933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.519 #8 NEW cov: 11642 ft: 12155 corp: 3/5b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeByte- 00:07:40.519 [2024-11-08 04:51:15.533781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d9d cdw11:00000000 00:07:40.519 [2024-11-08 04:51:15.533807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.519 #9 NEW cov: 11648 ft: 12353 corp: 4/8b lim: 10 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 CopyPart- 00:07:40.519 [2024-11-08 04:51:15.573875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:40.519 [2024-11-08 04:51:15.573901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.519 #10 NEW cov: 11733 ft: 12727 corp: 5/10b lim: 10 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 CopyPart- 00:07:40.519 [2024-11-08 04:51:15.613967] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f6f5 cdw11:00000000 00:07:40.519 [2024-11-08 04:51:15.613992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.777 #11 NEW cov: 11733 ft: 12862 corp: 6/12b lim: 10 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeBinInt- 00:07:40.777 [2024-11-08 04:51:15.654344] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:40.777 [2024-11-08 04:51:15.654370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.777 [2024-11-08 04:51:15.654421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:07:40.777 [2024-11-08 04:51:15.654434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.777 [2024-11-08 04:51:15.654483] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:07:40.777 [2024-11-08 04:51:15.654497] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.777 #12 NEW cov: 11733 ft: 13177 corp: 7/19b lim: 10 exec/s: 0 rss: 69Mb L: 7/7 MS: 1 InsertRepeatedBytes- 00:07:40.777 [2024-11-08 04:51:15.694206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f4f5 cdw11:00000000 00:07:40.778 [2024-11-08 04:51:15.694231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.778 #18 NEW cov: 11733 ft: 13311 corp: 8/21b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 ChangeBit- 00:07:40.778 [2024-11-08 04:51:15.734400] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009db5 cdw11:00000000 00:07:40.778 [2024-11-08 04:51:15.734425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.778 #19 NEW cov: 11733 ft: 13346 corp: 9/23b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 CopyPart- 00:07:40.778 [2024-11-08 04:51:15.774494] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dd0b cdw11:00000000 00:07:40.778 [2024-11-08 04:51:15.774520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.778 #20 NEW cov: 11733 ft: 13423 corp: 10/25b lim: 10 exec/s: 0 rss: 69Mb L: 2/7 MS: 1 ChangeBit- 00:07:40.778 [2024-11-08 04:51:15.814951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:07:40.778 [2024-11-08 04:51:15.814976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.778 [2024-11-08 04:51:15.815027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:40.778 [2024-11-08 04:51:15.815040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:40.778 [2024-11-08 04:51:15.815092] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:40.778 [2024-11-08 04:51:15.815105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:40.778 [2024-11-08 04:51:15.815153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:40.778 [2024-11-08 04:51:15.815166] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:40.778 #25 NEW cov: 11733 ft: 13653 corp: 11/34b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 5 ChangeByte-ChangeByte-ChangeBinInt-ChangeBinInt-InsertRepeatedBytes- 00:07:40.778 [2024-11-08 04:51:15.854709] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d0b cdw11:00000000 00:07:40.778 [2024-11-08 04:51:15.854735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:40.778 #26 NEW cov: 11733 ft: 13709 corp: 12/37b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 InsertByte- 00:07:41.036 [2024-11-08 04:51:15.894864] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009df4 cdw11:00000000 00:07:41.036 [2024-11-08 04:51:15.894891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.036 #27 NEW cov: 11733 ft: 13785 corp: 13/39b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 CrossOver- 00:07:41.036 [2024-11-08 04:51:15.934968] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d03 cdw11:00000000 00:07:41.036 [2024-11-08 04:51:15.934994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.036 #28 NEW cov: 11733 ft: 13812 corp: 14/41b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 1 ChangeBit- 00:07:41.036 [2024-11-08 04:51:15.975392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:07:41.036 [2024-11-08 04:51:15.975418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.036 [2024-11-08 04:51:15.975469] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.036 [2024-11-08 04:51:15.975481] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.036 [2024-11-08 04:51:15.975534] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.036 [2024-11-08 04:51:15.975547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.036 [2024-11-08 04:51:15.975597] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.036 [2024-11-08 04:51:15.975615] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.036 #29 NEW cov: 11733 ft: 13868 corp: 15/50b lim: 10 exec/s: 0 rss: 69Mb L: 9/9 MS: 1 ShuffleBytes- 00:07:41.036 [2024-11-08 04:51:16.015432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:07:41.036 [2024-11-08 04:51:16.015459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.036 [2024-11-08 04:51:16.015511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.036 [2024-11-08 04:51:16.015530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.036 [2024-11-08 04:51:16.015579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.036 [2024-11-08 04:51:16.015593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.036 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:41.036 #30 NEW cov: 11756 ft: 13925 corp: 16/57b lim: 10 exec/s: 0 rss: 69Mb L: 7/9 MS: 1 EraseBytes- 00:07:41.036 [2024-11-08 04:51:16.055324] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f4f5 cdw11:00000000 00:07:41.036 [2024-11-08 04:51:16.055351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.036 #31 NEW cov: 11756 ft: 13937 corp: 17/60b lim: 10 exec/s: 0 rss: 69Mb L: 3/9 MS: 1 CrossOver- 00:07:41.036 [2024-11-08 04:51:16.095428] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00003a3a cdw11:00000000 00:07:41.036 [2024-11-08 04:51:16.095454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.037 #35 NEW cov: 11756 ft: 13942 corp: 18/62b lim: 10 exec/s: 0 rss: 69Mb L: 2/9 MS: 4 EraseBytes-CopyPart-ChangeByte-CopyPart- 00:07:41.037 [2024-11-08 04:51:16.135559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009df5 cdw11:00000000 00:07:41.037 [2024-11-08 04:51:16.135585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.295 #36 NEW cov: 11756 ft: 13958 corp: 19/64b lim: 10 exec/s: 36 rss: 69Mb L: 2/9 MS: 1 CrossOver- 00:07:41.295 [2024-11-08 04:51:16.175918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.175944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 04:51:16.175994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.176007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 04:51:16.176058] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.176072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.295 #37 NEW cov: 11756 ft: 14000 corp: 20/71b lim: 10 exec/s: 37 rss: 69Mb L: 7/9 MS: 1 ChangeByte- 00:07:41.295 [2024-11-08 04:51:16.215935] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ae4 cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.215961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 04:51:16.216011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.216028] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.295 #38 NEW cov: 11756 ft: 14152 corp: 21/75b lim: 10 exec/s: 38 rss: 69Mb L: 4/9 MS: 1 EraseBytes- 00:07:41.295 [2024-11-08 04:51:16.256149] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.256174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 
04:51:16.256226] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00002ae4 cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.256239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 04:51:16.256289] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.256303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.295 #39 NEW cov: 11756 ft: 14173 corp: 22/82b lim: 10 exec/s: 39 rss: 69Mb L: 7/9 MS: 1 ChangeByte- 00:07:41.295 [2024-11-08 04:51:16.296134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.296161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 04:51:16.296215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.296229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.295 #40 NEW cov: 11756 ft: 14210 corp: 23/87b lim: 10 exec/s: 40 rss: 69Mb L: 5/9 MS: 1 InsertRepeatedBytes- 00:07:41.295 [2024-11-08 04:51:16.336487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00004848 cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.336512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 04:51:16.336569] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00004848 cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.336584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 04:51:16.336633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00004848 cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.336647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.295 [2024-11-08 04:51:16.336696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:000048f4 cdw11:00000000 00:07:41.295 [2024-11-08 04:51:16.336709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.295 #41 NEW cov: 11756 ft: 14273 corp: 24/96b lim: 10 exec/s: 41 rss: 69Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:41.295 [2024-11-08 04:51:16.376262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a3a cdw11:00000000 00:07:41.296 [2024-11-08 04:51:16.376288] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.296 #42 NEW cov: 11756 ft: 14310 corp: 25/98b lim: 10 exec/s: 42 rss: 70Mb L: 2/9 MS: 1 ChangeBit- 00:07:41.617 [2024-11-08 04:51:16.416614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 
cid:4 nsid:0 cdw10:00009df4 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.416640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.416691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000ae4 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.416705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.416756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000e4e4 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.416769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.617 #43 NEW cov: 11756 ft: 14328 corp: 26/104b lim: 10 exec/s: 43 rss: 70Mb L: 6/9 MS: 1 CrossOver- 00:07:41.617 [2024-11-08 04:51:16.456480] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dd0b cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.456506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.617 #44 NEW cov: 11756 ft: 14345 corp: 27/106b lim: 10 exec/s: 44 rss: 70Mb L: 2/9 MS: 1 CopyPart- 00:07:41.617 [2024-11-08 04:51:16.496577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d9d cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.496603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.617 #47 NEW cov: 11756 ft: 14374 corp: 28/109b lim: 10 exec/s: 47 rss: 70Mb L: 3/9 MS: 3 ChangeByte-ChangeByte-CrossOver- 00:07:41.617 [2024-11-08 04:51:16.526780] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000d7d7 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.526805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.526857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000d7f6 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.526870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.617 #49 NEW cov: 11756 ft: 14395 corp: 29/113b lim: 10 exec/s: 49 rss: 70Mb L: 4/9 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:41.617 [2024-11-08 04:51:16.567027] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.567052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.567102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ff2c cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.567116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.567166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.617 [2024-11-08 
04:51:16.567179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.617 #50 NEW cov: 11756 ft: 14422 corp: 30/120b lim: 10 exec/s: 50 rss: 70Mb L: 7/9 MS: 1 ChangeByte- 00:07:41.617 [2024-11-08 04:51:16.607250] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000601 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.607276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.607327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.607340] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.607390] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.607404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.607456] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.607469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.617 #53 NEW cov: 11756 ft: 14445 corp: 31/129b lim: 10 exec/s: 53 rss: 70Mb L: 9/9 MS: 3 EraseBytes-ChangeByte-CMP- DE: "\001\000\000\000\000\000\000\001"- 00:07:41.617 [2024-11-08 04:51:16.647251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000ae4 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.647276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.647328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000e49d cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.647341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.617 [2024-11-08 04:51:16.647391] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000b5e4 cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.647404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.617 #54 NEW cov: 11756 ft: 14459 corp: 32/135b lim: 10 exec/s: 54 rss: 70Mb L: 6/9 MS: 1 CrossOver- 00:07:41.617 [2024-11-08 04:51:16.687207] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00000bdd cdw11:00000000 00:07:41.617 [2024-11-08 04:51:16.687233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.876 #55 NEW cov: 11756 ft: 14463 corp: 33/137b lim: 10 exec/s: 55 rss: 70Mb L: 2/9 MS: 1 ShuffleBytes- 00:07:41.876 [2024-11-08 04:51:16.727271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002cdd cdw11:00000000 00:07:41.876 [2024-11-08 04:51:16.727297] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.876 #56 NEW cov: 11756 ft: 14469 corp: 34/140b lim: 10 exec/s: 56 rss: 70Mb L: 3/9 MS: 1 InsertByte- 00:07:41.876 [2024-11-08 04:51:16.767384] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000dd0b cdw11:00000000 00:07:41.876 [2024-11-08 04:51:16.767410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.876 #57 NEW cov: 11756 ft: 14477 corp: 35/142b lim: 10 exec/s: 57 rss: 70Mb L: 2/9 MS: 1 ShuffleBytes- 00:07:41.876 [2024-11-08 04:51:16.797481] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:0000f4f5 cdw11:00000000 00:07:41.876 [2024-11-08 04:51:16.797506] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.876 #58 NEW cov: 11756 ft: 14514 corp: 36/145b lim: 10 exec/s: 58 rss: 70Mb L: 3/9 MS: 1 ShuffleBytes- 00:07:41.877 [2024-11-08 04:51:16.837748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.837772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.877 [2024-11-08 04:51:16.837825] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.837838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.877 #59 NEW cov: 11756 ft: 14537 corp: 37/149b lim: 10 exec/s: 59 rss: 70Mb L: 4/9 MS: 1 EraseBytes- 00:07:41.877 [2024-11-08 04:51:16.878077] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:000001ff cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.878105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.877 [2024-11-08 04:51:16.878158] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.878171] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.877 [2024-11-08 04:51:16.878220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.878234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:41.877 [2024-11-08 04:51:16.878282] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000c2ff cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.878295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:41.877 #60 NEW cov: 11756 ft: 14550 corp: 38/158b lim: 10 exec/s: 60 rss: 70Mb L: 9/9 MS: 1 ChangeByte- 00:07:41.877 [2024-11-08 04:51:16.917980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d0b cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.918006] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.877 [2024-11-08 04:51:16.918057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:00009d9d cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.918072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:41.877 #61 NEW cov: 11756 ft: 14571 corp: 39/163b lim: 10 exec/s: 61 rss: 70Mb L: 5/9 MS: 1 CrossOver- 00:07:41.877 [2024-11-08 04:51:16.958007] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002a3a cdw11:00000000 00:07:41.877 [2024-11-08 04:51:16.958033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:41.877 #62 NEW cov: 11756 ft: 14596 corp: 40/166b lim: 10 exec/s: 62 rss: 70Mb L: 3/9 MS: 1 CopyPart- 00:07:42.136 [2024-11-08 04:51:16.998103] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00002af5 cdw11:00000000 00:07:42.136 [2024-11-08 04:51:16.998128] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.136 #63 NEW cov: 11756 ft: 14607 corp: 41/168b lim: 10 exec/s: 63 rss: 70Mb L: 2/9 MS: 1 CrossOver- 00:07:42.136 [2024-11-08 04:51:17.028201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d0b cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.028226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.136 #64 NEW cov: 11756 ft: 14666 corp: 42/171b lim: 10 exec/s: 64 rss: 70Mb L: 3/9 MS: 1 ShuffleBytes- 00:07:42.136 [2024-11-08 04:51:17.068691] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d9d cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.068716] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.136 [2024-11-08 04:51:17.068768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000b5f2 cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.068782] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.136 [2024-11-08 04:51:17.068835] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:0000f2f2 cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.068848] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.136 [2024-11-08 04:51:17.068900] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:7 nsid:0 cdw10:0000f2f2 cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.068915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:42.136 #65 NEW cov: 11756 ft: 14676 corp: 43/180b lim: 10 exec/s: 65 rss: 70Mb L: 9/9 MS: 1 InsertRepeatedBytes- 00:07:42.136 [2024-11-08 04:51:17.108634] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d9d cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.108659] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.136 [2024-11-08 04:51:17.108710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:5 nsid:0 cdw10:0000269d cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.108723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:42.136 [2024-11-08 04:51:17.108774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:6 nsid:0 cdw10:00009d26 cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.108787] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:42.136 #66 NEW cov: 11756 ft: 14687 corp: 44/186b lim: 10 exec/s: 66 rss: 70Mb L: 6/9 MS: 1 CopyPart- 00:07:42.136 [2024-11-08 04:51:17.148503] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO CQ (04) qid:0 cid:4 nsid:0 cdw10:00009d2d cdw11:00000000 00:07:42.136 [2024-11-08 04:51:17.148531] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.136 #67 NEW cov: 11756 ft: 14697 corp: 45/189b lim: 10 exec/s: 33 rss: 70Mb L: 3/9 MS: 1 InsertByte- 00:07:42.136 #67 DONE cov: 11756 ft: 14697 corp: 45/189b lim: 10 exec/s: 33 rss: 70Mb 00:07:42.136 ###### Recommended dictionary. ###### 00:07:42.136 "\001\000\000\000\000\000\000\001" # Uses: 0 00:07:42.136 ###### End of recommended dictionary. ###### 00:07:42.136 Done 67 runs in 2 second(s) 00:07:42.394 04:51:17 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_6.conf 00:07:42.394 04:51:17 -- ../common.sh@72 -- # (( i++ )) 00:07:42.394 04:51:17 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:42.394 04:51:17 -- ../common.sh@73 -- # start_llvm_fuzz 7 1 0x1 00:07:42.394 04:51:17 -- nvmf/run.sh@23 -- # local fuzzer_type=7 00:07:42.394 04:51:17 -- nvmf/run.sh@24 -- # local timen=1 00:07:42.394 04:51:17 -- nvmf/run.sh@25 -- # local core=0x1 00:07:42.394 04:51:17 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:42.394 04:51:17 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_7.conf 00:07:42.394 04:51:17 -- nvmf/run.sh@29 -- # printf %02d 7 00:07:42.394 04:51:17 -- nvmf/run.sh@29 -- # port=4407 00:07:42.395 04:51:17 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:42.395 04:51:17 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' 00:07:42.395 04:51:17 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4407"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:42.395 04:51:17 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4407' -c /tmp/fuzz_json_7.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 -Z 7 -r /var/tmp/spdk7.sock 00:07:42.395 [2024-11-08 04:51:17.331558] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:42.395 [2024-11-08 04:51:17.331635] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3681866 ] 00:07:42.395 EAL: No free 2048 kB hugepages reported on node 1 00:07:42.654 [2024-11-08 04:51:17.511162] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:42.654 [2024-11-08 04:51:17.576102] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:42.654 [2024-11-08 04:51:17.576228] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:42.654 [2024-11-08 04:51:17.634534] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:42.654 [2024-11-08 04:51:17.650881] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4407 *** 00:07:42.654 INFO: Running with entropic power schedule (0xFF, 100). 00:07:42.654 INFO: Seed: 1060869339 00:07:42.654 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:42.654 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:42.654 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_7 00:07:42.654 INFO: A corpus is not provided, starting from an empty corpus 00:07:42.654 #2 INITED exec/s: 0 rss: 60Mb 00:07:42.654 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:42.654 This may also happen if the target rejected all inputs we tried so far 00:07:42.654 [2024-11-08 04:51:17.696001] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:42.654 [2024-11-08 04:51:17.696030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.913 NEW_FUNC[1/669]: 0x445d08 in fuzz_admin_delete_io_submission_queue_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:172 00:07:42.913 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:42.913 #8 NEW cov: 11529 ft: 11530 corp: 2/3b lim: 10 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:42.913 [2024-11-08 04:51:17.996822] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000404 cdw11:00000000 00:07:42.913 [2024-11-08 04:51:17.996853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:42.913 [2024-11-08 04:51:17.996906] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:42.913 [2024-11-08 04:51:17.996920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.173 #9 NEW cov: 11642 ft: 12235 corp: 3/8b lim: 10 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:43.173 [2024-11-08 04:51:18.046903] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.046928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.173 
[2024-11-08 04:51:18.046980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000404 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.046993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.173 #10 NEW cov: 11648 ft: 12570 corp: 4/12b lim: 10 exec/s: 0 rss: 69Mb L: 4/5 MS: 1 CrossOver- 00:07:43.173 [2024-11-08 04:51:18.087124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.087150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.173 [2024-11-08 04:51:18.087200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.087214] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.173 [2024-11-08 04:51:18.087266] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006004 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.087281] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.173 #11 NEW cov: 11733 ft: 12989 corp: 5/18b lim: 10 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 CrossOver- 00:07:43.173 [2024-11-08 04:51:18.127126] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000404 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.127151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.173 [2024-11-08 04:51:18.127202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.127215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.173 #12 NEW cov: 11733 ft: 13106 corp: 6/23b lim: 10 exec/s: 0 rss: 69Mb L: 5/6 MS: 1 ChangeByte- 00:07:43.173 [2024-11-08 04:51:18.167371] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a60 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.167395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.173 [2024-11-08 04:51:18.167447] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.167460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.173 [2024-11-08 04:51:18.167510] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006004 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.167528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.173 #13 NEW cov: 11733 ft: 13201 corp: 7/29b lim: 10 exec/s: 0 rss: 69Mb L: 6/6 MS: 1 ChangeBit- 00:07:43.173 [2024-11-08 04:51:18.207368] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:0000a900 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.207392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.173 [2024-11-08 04:51:18.207443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.207456] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.173 #16 NEW cov: 11733 ft: 13275 corp: 8/34b lim: 10 exec/s: 0 rss: 69Mb L: 5/6 MS: 3 ChangeByte-ShuffleBytes-InsertRepeatedBytes- 00:07:43.173 [2024-11-08 04:51:18.247357] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:43.173 [2024-11-08 04:51:18.247381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.173 #17 NEW cov: 11733 ft: 13366 corp: 9/37b lim: 10 exec/s: 0 rss: 69Mb L: 3/6 MS: 1 InsertByte- 00:07:43.433 [2024-11-08 04:51:18.287937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.287961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.288011] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.288026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.288075] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.288088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.288139] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000ffff cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.288152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.288206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000ff0a cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.288220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.433 #19 NEW cov: 11733 ft: 13623 corp: 10/47b lim: 10 exec/s: 0 rss: 69Mb L: 10/10 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:07:43.433 [2024-11-08 04:51:18.327574] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.327609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.433 #20 NEW cov: 11733 ft: 13699 corp: 11/49b lim: 10 exec/s: 0 rss: 69Mb L: 2/10 MS: 1 ShuffleBytes- 00:07:43.433 [2024-11-08 04:51:18.358005] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.433 [2024-11-08 
04:51:18.358030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.358081] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.358095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.358143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.358156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.358206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.358219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.433 #24 NEW cov: 11733 ft: 13768 corp: 12/58b lim: 10 exec/s: 0 rss: 69Mb L: 9/10 MS: 4 ShuffleBytes-CopyPart-ChangeBit-InsertRepeatedBytes- 00:07:43.433 [2024-11-08 04:51:18.398010] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a900 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.398034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.398086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000031 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.398099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.398150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.398164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.433 #25 NEW cov: 11733 ft: 13816 corp: 13/64b lim: 10 exec/s: 0 rss: 69Mb L: 6/10 MS: 1 InsertByte- 00:07:43.433 [2024-11-08 04:51:18.438059] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a9f cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.438083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.438135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000404 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.438147] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.433 #26 NEW cov: 11733 ft: 13830 corp: 14/68b lim: 10 exec/s: 0 rss: 69Mb L: 4/10 MS: 1 ChangeBinInt- 00:07:43.433 [2024-11-08 04:51:18.478107] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a900 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.478137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.478188] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.478202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.433 #27 NEW cov: 11733 ft: 13838 corp: 15/73b lim: 10 exec/s: 0 rss: 69Mb L: 5/10 MS: 1 ChangeBit- 00:07:43.433 [2024-11-08 04:51:18.518458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a60 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.518482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.518537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.518551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.518601] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006060 cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.518614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.433 [2024-11-08 04:51:18.518663] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.433 [2024-11-08 04:51:18.518677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.693 #28 NEW cov: 11733 ft: 13844 corp: 16/82b lim: 10 exec/s: 0 rss: 69Mb L: 9/10 MS: 1 CopyPart- 00:07:43.693 [2024-11-08 04:51:18.558261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000260 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.558286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.693 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:43.693 #29 NEW cov: 11756 ft: 13868 corp: 17/85b lim: 10 exec/s: 0 rss: 70Mb L: 3/10 MS: 1 ChangeBit- 00:07:43.693 [2024-11-08 04:51:18.598602] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a60 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.598627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.693 [2024-11-08 04:51:18.598679] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.598693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.693 [2024-11-08 04:51:18.598743] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006004 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.598757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.693 #30 NEW cov: 11756 ft: 13959 corp: 18/91b lim: 10 exec/s: 0 rss: 70Mb L: 6/10 MS: 1 CopyPart- 00:07:43.693 [2024-11-08 04:51:18.638502] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a900 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.638532] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.693 #31 NEW cov: 11756 ft: 14010 corp: 19/94b lim: 10 exec/s: 0 rss: 70Mb L: 3/10 MS: 1 EraseBytes- 00:07:43.693 [2024-11-08 04:51:18.678605] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a900 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.678633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.693 #32 NEW cov: 11756 ft: 14019 corp: 20/97b lim: 10 exec/s: 32 rss: 70Mb L: 3/10 MS: 1 ChangeByte- 00:07:43.693 [2024-11-08 04:51:18.718819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000004e5 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.718844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.693 [2024-11-08 04:51:18.718895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.718909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.693 #33 NEW cov: 11756 ft: 14064 corp: 21/102b lim: 10 exec/s: 33 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:43.693 [2024-11-08 04:51:18.759160] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a60 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.759185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.693 [2024-11-08 04:51:18.759237] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000400 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.759250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.693 [2024-11-08 04:51:18.759300] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000960 cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.759314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.693 [2024-11-08 04:51:18.759361] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.759374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.693 #34 NEW cov: 11756 ft: 14078 corp: 22/111b lim: 10 exec/s: 34 rss: 70Mb L: 9/10 MS: 1 ChangeBinInt- 00:07:43.693 [2024-11-08 04:51:18.799112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a9f cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.799137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.693 [2024-11-08 04:51:18.799189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006004 
cdw11:00000000 00:07:43.693 [2024-11-08 04:51:18.799202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.952 #35 NEW cov: 11756 ft: 14092 corp: 23/115b lim: 10 exec/s: 35 rss: 70Mb L: 4/10 MS: 1 CrossOver- 00:07:43.953 [2024-11-08 04:51:18.839076] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000260 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.839101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.953 #36 NEW cov: 11756 ft: 14190 corp: 24/118b lim: 10 exec/s: 36 rss: 70Mb L: 3/10 MS: 1 ShuffleBytes- 00:07:43.953 [2024-11-08 04:51:18.879674] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000004e5 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.879698] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:18.879752] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.879766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:18.879814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.879831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:18.879881] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000e104 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.879894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:18.879945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000afe cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.879959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:43.953 #37 NEW cov: 11756 ft: 14229 corp: 25/128b lim: 10 exec/s: 37 rss: 70Mb L: 10/10 MS: 1 InsertRepeatedBytes- 00:07:43.953 [2024-11-08 04:51:18.919421] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a9f cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.919445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:18.919498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006005 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.919511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.953 #38 NEW cov: 11756 ft: 14252 corp: 26/132b lim: 10 exec/s: 38 rss: 70Mb L: 4/10 MS: 1 ChangeBit- 00:07:43.953 [2024-11-08 04:51:18.959533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000602c cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.959558] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:18.959607] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000404 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.959620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.953 #39 NEW cov: 11756 ft: 14292 corp: 27/136b lim: 10 exec/s: 39 rss: 70Mb L: 4/10 MS: 1 CrossOver- 00:07:43.953 [2024-11-08 04:51:18.999741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00004a04 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.999767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:18.999819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.999832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:18.999885] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00006004 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:18.999898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:43.953 #40 NEW cov: 11756 ft: 14325 corp: 28/142b lim: 10 exec/s: 40 rss: 70Mb L: 6/10 MS: 1 CrossOver- 00:07:43.953 [2024-11-08 04:51:19.039727] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000f3a9 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:19.039752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:43.953 [2024-11-08 04:51:19.039805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000040 cdw11:00000000 00:07:43.953 [2024-11-08 04:51:19.039819] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.212 #41 NEW cov: 11756 ft: 14333 corp: 29/146b lim: 10 exec/s: 41 rss: 70Mb L: 4/10 MS: 1 InsertByte- 00:07:44.212 [2024-11-08 04:51:19.079860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00001404 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.079885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.212 [2024-11-08 04:51:19.079937] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.079951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.212 #42 NEW cov: 11756 ft: 14355 corp: 30/151b lim: 10 exec/s: 42 rss: 70Mb L: 5/10 MS: 1 ChangeBit- 00:07:44.212 [2024-11-08 04:51:19.119858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000025a cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.119883] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.212 #43 NEW cov: 11756 ft: 
14416 corp: 31/154b lim: 10 exec/s: 43 rss: 70Mb L: 3/10 MS: 1 ChangeBinInt- 00:07:44.212 [2024-11-08 04:51:19.160220] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a900 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.160246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.212 [2024-11-08 04:51:19.160299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003100 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.160315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.212 [2024-11-08 04:51:19.160369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.160384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.212 #44 NEW cov: 11756 ft: 14431 corp: 32/160b lim: 10 exec/s: 44 rss: 70Mb L: 6/10 MS: 1 ShuffleBytes- 00:07:44.212 [2024-11-08 04:51:19.200185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000004e5 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.200210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.212 [2024-11-08 04:51:19.200262] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000044a cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.200275] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.212 #45 NEW cov: 11756 ft: 14445 corp: 33/165b lim: 10 exec/s: 45 rss: 70Mb L: 5/10 MS: 1 ChangeBit- 00:07:44.212 [2024-11-08 04:51:19.240202] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000040a cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.240226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.212 #46 NEW cov: 11756 ft: 14456 corp: 34/168b lim: 10 exec/s: 46 rss: 70Mb L: 3/10 MS: 1 CrossOver- 00:07:44.212 [2024-11-08 04:51:19.280753] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000004e5 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.280778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.212 [2024-11-08 04:51:19.280829] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.280843] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.212 [2024-11-08 04:51:19.280893] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.280909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.212 [2024-11-08 04:51:19.280960] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 
cdw10:0000e104 cdw11:00000000 00:07:44.212 [2024-11-08 04:51:19.280973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.213 [2024-11-08 04:51:19.281021] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:00000afe cdw11:00000000 00:07:44.213 [2024-11-08 04:51:19.281035] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.213 #47 NEW cov: 11756 ft: 14482 corp: 35/178b lim: 10 exec/s: 47 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:44.213 [2024-11-08 04:51:19.320531] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000404 cdw11:00000000 00:07:44.213 [2024-11-08 04:51:19.320557] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.213 [2024-11-08 04:51:19.320610] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000040a cdw11:00000000 00:07:44.213 [2024-11-08 04:51:19.320624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.472 #48 NEW cov: 11756 ft: 14494 corp: 36/183b lim: 10 exec/s: 48 rss: 70Mb L: 5/10 MS: 1 ChangeByte- 00:07:44.472 [2024-11-08 04:51:19.360491] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000260 cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.360516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 #49 NEW cov: 11756 ft: 14509 corp: 37/186b lim: 10 exec/s: 49 rss: 70Mb L: 3/10 MS: 1 ChangeByte- 00:07:44.472 [2024-11-08 04:51:19.400720] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a9f cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.400745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 [2024-11-08 04:51:19.400796] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006005 cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.400810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.472 #50 NEW cov: 11756 ft: 14527 corp: 38/190b lim: 10 exec/s: 50 rss: 70Mb L: 4/10 MS: 1 ShuffleBytes- 00:07:44.472 [2024-11-08 04:51:19.440883] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a9f cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.440908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 [2024-11-08 04:51:19.440961] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000424 cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.440974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.472 #51 NEW cov: 11756 ft: 14530 corp: 39/194b lim: 10 exec/s: 51 rss: 70Mb L: 4/10 MS: 1 ChangeBit- 00:07:44.472 [2024-11-08 04:51:19.470947] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 
cdw10:0000602c cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.470972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 [2024-11-08 04:51:19.471024] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00006004 cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.471038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.472 #52 NEW cov: 11756 ft: 14541 corp: 40/198b lim: 10 exec/s: 52 rss: 70Mb L: 4/10 MS: 1 CrossOver- 00:07:44.472 [2024-11-08 04:51:19.510955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a9c5 cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.510980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 #53 NEW cov: 11756 ft: 14552 corp: 41/201b lim: 10 exec/s: 53 rss: 70Mb L: 3/10 MS: 1 ChangeByte- 00:07:44.472 [2024-11-08 04:51:19.541312] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000260 cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.541336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.472 [2024-11-08 04:51:19.541389] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00005050 cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.541402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.472 [2024-11-08 04:51:19.541450] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00005050 cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.541464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.472 [2024-11-08 04:51:19.541512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000502c cdw11:00000000 00:07:44.472 [2024-11-08 04:51:19.541530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.472 #54 NEW cov: 11756 ft: 14554 corp: 42/209b lim: 10 exec/s: 54 rss: 70Mb L: 8/10 MS: 1 InsertRepeatedBytes- 00:07:44.730 [2024-11-08 04:51:19.581401] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:0000a9e3 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.581426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.581479] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00003100 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.581493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.581551] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.581565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.730 #55 NEW 
cov: 11756 ft: 14609 corp: 43/215b lim: 10 exec/s: 55 rss: 70Mb L: 6/10 MS: 1 ChangeByte- 00:07:44.730 [2024-11-08 04:51:19.621343] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000a0a cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.621369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.730 #59 NEW cov: 11756 ft: 14619 corp: 44/217b lim: 10 exec/s: 59 rss: 70Mb L: 2/10 MS: 4 ChangeBit-CrossOver-ChangeBit-CopyPart- 00:07:44.730 [2024-11-08 04:51:19.651723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.651747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.651800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.651813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.651863] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.651877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.651931] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.651945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.730 #60 NEW cov: 11756 ft: 14631 corp: 45/226b lim: 10 exec/s: 60 rss: 70Mb L: 9/10 MS: 1 ShuffleBytes- 00:07:44.730 [2024-11-08 04:51:19.692014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:4 nsid:0 cdw10:000004e1 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.692039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.692091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:5 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.692105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.692156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:6 nsid:0 cdw10:0000e1e1 cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.692170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.692221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:7 nsid:0 cdw10:0000040a cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.692235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:44.730 [2024-11-08 04:51:19.692287] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DELETE IO SQ (00) qid:0 cid:8 nsid:0 cdw10:0000fefe cdw11:00000000 00:07:44.730 [2024-11-08 04:51:19.692300] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:44.730 #61 NEW cov: 11756 ft: 14647 corp: 46/236b lim: 10 exec/s: 30 rss: 70Mb L: 10/10 MS: 1 CopyPart- 00:07:44.730 #61 DONE cov: 11756 ft: 14647 corp: 46/236b lim: 10 exec/s: 30 rss: 70Mb 00:07:44.730 Done 61 runs in 2 second(s) 00:07:44.730 04:51:19 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_7.conf 00:07:44.730 04:51:19 -- ../common.sh@72 -- # (( i++ )) 00:07:44.730 04:51:19 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:44.730 04:51:19 -- ../common.sh@73 -- # start_llvm_fuzz 8 1 0x1 00:07:44.730 04:51:19 -- nvmf/run.sh@23 -- # local fuzzer_type=8 00:07:44.730 04:51:19 -- nvmf/run.sh@24 -- # local timen=1 00:07:44.730 04:51:19 -- nvmf/run.sh@25 -- # local core=0x1 00:07:44.730 04:51:19 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:44.730 04:51:19 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_8.conf 00:07:44.730 04:51:19 -- nvmf/run.sh@29 -- # printf %02d 8 00:07:44.989 04:51:19 -- nvmf/run.sh@29 -- # port=4408 00:07:44.989 04:51:19 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:44.989 04:51:19 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' 00:07:44.989 04:51:19 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4408"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:44.989 04:51:19 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4408' -c /tmp/fuzz_json_8.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 -Z 8 -r /var/tmp/spdk8.sock 00:07:44.989 [2024-11-08 04:51:19.876825] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:44.989 [2024-11-08 04:51:19.876892] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3682408 ] 00:07:44.989 EAL: No free 2048 kB hugepages reported on node 1 00:07:44.989 [2024-11-08 04:51:20.064511] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:45.248 [2024-11-08 04:51:20.138261] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:45.248 [2024-11-08 04:51:20.138414] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.248 [2024-11-08 04:51:20.197169] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:45.248 [2024-11-08 04:51:20.213506] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4408 *** 00:07:45.248 INFO: Running with entropic power schedule (0xFF, 100). 
00:07:45.248 INFO: Seed: 3620918344 00:07:45.248 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:45.248 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:45.248 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_8 00:07:45.248 INFO: A corpus is not provided, starting from an empty corpus 00:07:45.248 [2024-11-08 04:51:20.260941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.248 [2024-11-08 04:51:20.260970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.248 #2 INITED cov: 11557 ft: 11558 corp: 1/1b exec/s: 0 rss: 67Mb 00:07:45.248 [2024-11-08 04:51:20.290944] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.248 [2024-11-08 04:51:20.290970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.248 #3 NEW cov: 11670 ft: 11953 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:07:45.248 [2024-11-08 04:51:20.331032] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.248 [2024-11-08 04:51:20.331058] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.506 #4 NEW cov: 11676 ft: 12270 corp: 3/3b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeByte- 00:07:45.506 [2024-11-08 04:51:20.371305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.506 [2024-11-08 04:51:20.371330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.506 [2024-11-08 04:51:20.371387] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.371400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.507 #5 NEW cov: 11761 ft: 13284 corp: 4/5b lim: 5 exec/s: 0 rss: 67Mb L: 2/2 MS: 1 CopyPart- 00:07:45.507 [2024-11-08 04:51:20.421632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.421656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.507 [2024-11-08 04:51:20.421712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.421725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.507 [2024-11-08 04:51:20.421779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.421792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.507 #6 NEW cov: 11761 ft: 13583 corp: 5/8b lim: 5 exec/s: 0 rss: 67Mb L: 3/3 MS: 1 InsertByte- 00:07:45.507 [2024-11-08 04:51:20.471436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.471461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.507 #7 NEW cov: 11761 ft: 13679 corp: 6/9b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ShuffleBytes- 00:07:45.507 [2024-11-08 04:51:20.511511] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.511541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.507 #8 NEW cov: 11761 ft: 13768 corp: 7/10b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ChangeBit- 00:07:45.507 [2024-11-08 04:51:20.551846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.551872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.507 [2024-11-08 04:51:20.551928] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.551943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.507 #9 NEW cov: 11761 ft: 13820 corp: 8/12b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 InsertByte- 00:07:45.507 [2024-11-08 04:51:20.591896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.591921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.507 [2024-11-08 04:51:20.591987] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.507 [2024-11-08 04:51:20.592001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.507 #10 NEW cov: 11761 ft: 13844 corp: 9/14b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 ChangeBit- 00:07:45.766 [2024-11-08 04:51:20.632002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.632027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.766 [2024-11-08 04:51:20.632083] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.632097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.766 #11 NEW cov: 11761 ft: 13880 corp: 10/16b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 CrossOver- 00:07:45.766 [2024-11-08 04:51:20.672153] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.672178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.766 [2024-11-08 04:51:20.672236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.672249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.766 #12 NEW cov: 11761 ft: 13923 corp: 11/18b lim: 5 exec/s: 0 rss: 67Mb L: 2/3 MS: 1 InsertByte- 00:07:45.766 [2024-11-08 04:51:20.712114] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.712139] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.766 #13 NEW cov: 11761 ft: 13945 corp: 12/19b lim: 5 exec/s: 0 rss: 67Mb L: 1/3 MS: 1 ShuffleBytes- 00:07:45.766 [2024-11-08 04:51:20.752356] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.752381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.766 [2024-11-08 04:51:20.752436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.752450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.766 #14 NEW cov: 11761 ft: 13988 corp: 13/21b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 InsertByte- 00:07:45.766 [2024-11-08 04:51:20.792470] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.792496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.766 [2024-11-08 04:51:20.792556] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.792571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.766 #15 NEW cov: 11761 ft: 13998 corp: 14/23b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeBit- 00:07:45.766 [2024-11-08 04:51:20.832759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: 
NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.832785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.766 [2024-11-08 04:51:20.832840] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.832855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:45.766 [2024-11-08 04:51:20.832908] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.832922] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:45.766 #16 NEW cov: 11761 ft: 14031 corp: 15/26b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 InsertByte- 00:07:45.766 [2024-11-08 04:51:20.872723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.872748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:45.766 [2024-11-08 04:51:20.872805] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:45.766 [2024-11-08 04:51:20.872820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.026 #17 NEW cov: 11761 ft: 14051 corp: 16/28b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ChangeByte- 00:07:46.026 [2024-11-08 04:51:20.913135] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:20.913160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.026 [2024-11-08 04:51:20.913215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:20.913228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.026 [2024-11-08 04:51:20.913284] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:20.913298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.026 [2024-11-08 04:51:20.913351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:20.913364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.026 #18 NEW cov: 11761 ft: 
14339 corp: 17/32b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 InsertRepeatedBytes- 00:07:46.026 [2024-11-08 04:51:20.952800] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:20.952825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.026 #19 NEW cov: 11761 ft: 14351 corp: 18/33b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 ChangeByte- 00:07:46.026 [2024-11-08 04:51:20.982878] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:20.982904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.026 #20 NEW cov: 11761 ft: 14439 corp: 19/34b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 ChangeBit- 00:07:46.026 [2024-11-08 04:51:21.023279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:21.023305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.026 [2024-11-08 04:51:21.023362] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:21.023376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.026 [2024-11-08 04:51:21.023431] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:21.023444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.026 #21 NEW cov: 11761 ft: 14481 corp: 20/37b lim: 5 exec/s: 0 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:07:46.026 [2024-11-08 04:51:21.063116] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:21.063142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.026 #22 NEW cov: 11761 ft: 14498 corp: 21/38b lim: 5 exec/s: 0 rss: 68Mb L: 1/4 MS: 1 EraseBytes- 00:07:46.026 [2024-11-08 04:51:21.103376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:21.103400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.026 [2024-11-08 04:51:21.103455] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.026 [2024-11-08 04:51:21.103468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.026 #23 NEW 
cov: 11761 ft: 14528 corp: 22/40b lim: 5 exec/s: 0 rss: 68Mb L: 2/4 MS: 1 InsertByte- 00:07:46.285 [2024-11-08 04:51:21.143359] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.285 [2024-11-08 04:51:21.143385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.544 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:46.544 #24 NEW cov: 11784 ft: 14571 corp: 23/41b lim: 5 exec/s: 24 rss: 69Mb L: 1/4 MS: 1 ChangeByte- 00:07:46.544 [2024-11-08 04:51:21.434458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.434491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.434549] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.434563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.434632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.434646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.544 #25 NEW cov: 11784 ft: 14638 corp: 24/44b lim: 5 exec/s: 25 rss: 69Mb L: 3/4 MS: 1 CrossOver- 00:07:46.544 [2024-11-08 04:51:21.474537] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.474563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.474618] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.474631] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.474683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.474696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.544 #26 NEW cov: 11784 ft: 14662 corp: 25/47b lim: 5 exec/s: 26 rss: 69Mb L: 3/4 MS: 1 CrossOver- 00:07:46.544 [2024-11-08 04:51:21.514821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.514846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 
dnr:0 00:07:46.544 [2024-11-08 04:51:21.514901] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.514917] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.514971] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.514984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.515038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.515051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.544 #27 NEW cov: 11784 ft: 14667 corp: 26/51b lim: 5 exec/s: 27 rss: 69Mb L: 4/4 MS: 1 CopyPart- 00:07:46.544 [2024-11-08 04:51:21.554622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.554647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.554701] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.554715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.544 #28 NEW cov: 11784 ft: 14678 corp: 27/53b lim: 5 exec/s: 28 rss: 70Mb L: 2/4 MS: 1 CrossOver- 00:07:46.544 [2024-11-08 04:51:21.594576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.594601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.544 #29 NEW cov: 11784 ft: 14741 corp: 28/54b lim: 5 exec/s: 29 rss: 70Mb L: 1/4 MS: 1 ShuffleBytes- 00:07:46.544 [2024-11-08 04:51:21.635029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.635054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.635110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.635124] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.544 [2024-11-08 04:51:21.635178] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:46.544 [2024-11-08 04:51:21.635192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.803 #30 NEW cov: 11784 ft: 14759 corp: 29/57b lim: 5 exec/s: 30 rss: 70Mb L: 3/4 MS: 1 InsertByte- 00:07:46.803 [2024-11-08 04:51:21.675113] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.803 [2024-11-08 04:51:21.675138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.803 [2024-11-08 04:51:21.675193] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.803 [2024-11-08 04:51:21.675207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.803 [2024-11-08 04:51:21.675263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.675277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.804 #31 NEW cov: 11784 ft: 14810 corp: 30/60b lim: 5 exec/s: 31 rss: 70Mb L: 3/4 MS: 1 CopyPart- 00:07:46.804 [2024-11-08 04:51:21.715476] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.715501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.715562] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.715577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.715631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.715645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.715699] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.715712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.715768] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.715781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.804 #32 NEW cov: 11784 ft: 14943 corp: 31/65b lim: 5 exec/s: 32 rss: 70Mb L: 5/5 MS: 1 CrossOver- 00:07:46.804 [2024-11-08 
04:51:21.755617] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.755642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.755697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.755710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.755764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.755777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.755831] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.755844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.804 #33 NEW cov: 11784 ft: 14955 corp: 32/69b lim: 5 exec/s: 33 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:46.804 [2024-11-08 04:51:21.795426] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.795454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.795508] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.795527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.804 #34 NEW cov: 11784 ft: 14967 corp: 33/71b lim: 5 exec/s: 34 rss: 70Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:46.804 [2024-11-08 04:51:21.835698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.835722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.835778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.835791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.835848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.835862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.804 #35 NEW cov: 11784 ft: 14986 corp: 34/74b lim: 5 exec/s: 35 rss: 70Mb L: 3/5 MS: 1 ShuffleBytes- 00:07:46.804 [2024-11-08 04:51:21.876134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.876159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.876215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.876229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.876283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.876296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.876350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.876363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:46.804 [2024-11-08 04:51:21.876417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000b cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:46.804 [2024-11-08 04:51:21.876430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:46.804 #36 NEW cov: 11784 ft: 14993 corp: 35/79b lim: 5 exec/s: 36 rss: 70Mb L: 5/5 MS: 1 CMP- DE: "\001\000"- 00:07:47.064 [2024-11-08 04:51:21.916306] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:21.916331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:21.916386] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:21.916402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:21.916458] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:21.916471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:21.916526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 
04:51:21.916539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:21.916594] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:21.916607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.064 #37 NEW cov: 11784 ft: 15007 corp: 36/84b lim: 5 exec/s: 37 rss: 70Mb L: 5/5 MS: 1 CopyPart- 00:07:47.064 [2024-11-08 04:51:21.965910] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:21.965933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:21.965989] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:21.966003] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.064 #38 NEW cov: 11784 ft: 15023 corp: 37/86b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 InsertByte- 00:07:47.064 [2024-11-08 04:51:22.006206] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.006232] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:22.006286] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.006300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:22.006355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.006369] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.064 #39 NEW cov: 11784 ft: 15076 corp: 38/89b lim: 5 exec/s: 39 rss: 70Mb L: 3/5 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:47.064 [2024-11-08 04:51:22.046177] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.046202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:22.046247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.046260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:22.086287] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.086312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:22.086369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.086383] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.064 #41 NEW cov: 11784 ft: 15101 corp: 39/91b lim: 5 exec/s: 41 rss: 70Mb L: 2/5 MS: 2 ShuffleBytes-ShuffleBytes- 00:07:47.064 [2024-11-08 04:51:22.126376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.126402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:22.126457] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.126471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.064 #42 NEW cov: 11784 ft: 15109 corp: 40/93b lim: 5 exec/s: 42 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:47.064 [2024-11-08 04:51:22.166535] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.166560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.064 [2024-11-08 04:51:22.166615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.064 [2024-11-08 04:51:22.166627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.324 #43 NEW cov: 11784 ft: 15133 corp: 41/95b lim: 5 exec/s: 43 rss: 70Mb L: 2/5 MS: 1 ChangeBinInt- 00:07:47.324 [2024-11-08 04:51:22.207133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.324 [2024-11-08 04:51:22.207158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.324 [2024-11-08 04:51:22.207213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.324 [2024-11-08 04:51:22.207226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.324 [2024-11-08 04:51:22.207279] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.324 
[2024-11-08 04:51:22.207292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:47.324 [2024-11-08 04:51:22.207346] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.324 [2024-11-08 04:51:22.207359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:47.324 [2024-11-08 04:51:22.207412] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:8 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.324 [2024-11-08 04:51:22.207425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:47.324 #44 NEW cov: 11784 ft: 15155 corp: 42/100b lim: 5 exec/s: 44 rss: 70Mb L: 5/5 MS: 1 PersAutoDict- DE: "\001\000"- 00:07:47.324 [2024-11-08 04:51:22.246741] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:4 nsid:0 cdw10:00000004 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.324 [2024-11-08 04:51:22.246766] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.324 [2024-11-08 04:51:22.246820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE ATTACHMENT (15) qid:0 cid:5 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.324 [2024-11-08 04:51:22.246834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.324 #45 NEW cov: 11784 ft: 15162 corp: 43/102b lim: 5 exec/s: 22 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:07:47.324 #45 DONE cov: 11784 ft: 15162 corp: 43/102b lim: 5 exec/s: 22 rss: 70Mb 00:07:47.324 ###### Recommended dictionary. ###### 00:07:47.324 "\001\000" # Uses: 2 00:07:47.324 ###### End of recommended dictionary. 
###### 00:07:47.324 Done 45 runs in 2 second(s) 00:07:47.324 04:51:22 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_8.conf 00:07:47.324 04:51:22 -- ../common.sh@72 -- # (( i++ )) 00:07:47.324 04:51:22 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:47.324 04:51:22 -- ../common.sh@73 -- # start_llvm_fuzz 9 1 0x1 00:07:47.324 04:51:22 -- nvmf/run.sh@23 -- # local fuzzer_type=9 00:07:47.324 04:51:22 -- nvmf/run.sh@24 -- # local timen=1 00:07:47.324 04:51:22 -- nvmf/run.sh@25 -- # local core=0x1 00:07:47.324 04:51:22 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:47.324 04:51:22 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_9.conf 00:07:47.324 04:51:22 -- nvmf/run.sh@29 -- # printf %02d 9 00:07:47.324 04:51:22 -- nvmf/run.sh@29 -- # port=4409 00:07:47.324 04:51:22 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:47.324 04:51:22 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' 00:07:47.324 04:51:22 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4409"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:47.324 04:51:22 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4409' -c /tmp/fuzz_json_9.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 -Z 9 -r /var/tmp/spdk9.sock 00:07:47.324 [2024-11-08 04:51:22.431460] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:47.324 [2024-11-08 04:51:22.431554] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3682790 ] 00:07:47.583 EAL: No free 2048 kB hugepages reported on node 1 00:07:47.583 [2024-11-08 04:51:22.609768] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:47.583 [2024-11-08 04:51:22.673125] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:47.583 [2024-11-08 04:51:22.673269] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:47.842 [2024-11-08 04:51:22.731700] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:47.842 [2024-11-08 04:51:22.748031] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4409 *** 00:07:47.842 INFO: Running with entropic power schedule (0xFF, 100). 
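The traced commands above show the harness pattern used for each fuzzer instance: run 9 is given its own NVMe/TCP listen port (44 followed by the zero-padded run index, hence 4409), a private JSON config derived from the shared fuzz_json.conf template via sed, and its own corpus directory and RPC socket. A minimal bash sketch of that per-run setup, assuming rootdir points at the SPDK checkout; the wrapper name run_one_fuzzer is illustrative and not an SPDK script:

#!/usr/bin/env bash
# Hypothetical wrapper reconstructing the per-run steps traced above
# (paths and flags taken from the log; run_one_fuzzer itself is assumed).
run_one_fuzzer() {
    local i=$1                                    # fuzzer type / run index, e.g. 9
    local port
    port="44$(printf '%02d' "$i")"                # 4409 for run 9
    # Rewrite the shared template so this instance listens on its own port.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_$i.conf"
    local trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"
    mkdir -p "$rootdir/../corpus/llvm_nvmf_$i"    # seed corpus dir (empty on a fresh run)
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m 0x1 -s 512 \
        -P "$rootdir/../output/llvm/" -F "$trid" -c "/tmp/fuzz_json_$i.conf" \
        -t 1 -D "$rootdir/../corpus/llvm_nvmf_$i" -Z "$i" -r "/var/tmp/spdk$i.sock"
}

Each instance therefore fuzzes an isolated target on 127.0.0.1, which is why the seed, coverage counters, and corpus statistics reported below restart from scratch for run 9.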
00:07:47.842 INFO: Seed: 1863901126 00:07:47.842 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:47.842 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:47.842 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_9 00:07:47.842 INFO: A corpus is not provided, starting from an empty corpus 00:07:47.842 [2024-11-08 04:51:22.793275] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.842 [2024-11-08 04:51:22.793304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.842 #2 INITED cov: 11557 ft: 11558 corp: 1/1b exec/s: 0 rss: 67Mb 00:07:47.842 [2024-11-08 04:51:22.823247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.842 [2024-11-08 04:51:22.823272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.842 #3 NEW cov: 11670 ft: 12116 corp: 2/2b lim: 5 exec/s: 0 rss: 67Mb L: 1/1 MS: 1 ChangeBinInt- 00:07:47.842 [2024-11-08 04:51:22.863381] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.842 [2024-11-08 04:51:22.863407] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.842 #4 NEW cov: 11676 ft: 12271 corp: 3/3b lim: 5 exec/s: 0 rss: 68Mb L: 1/1 MS: 1 ChangeByte- 00:07:47.842 [2024-11-08 04:51:22.903653] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.842 [2024-11-08 04:51:22.903678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.842 [2024-11-08 04:51:22.903735] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.842 [2024-11-08 04:51:22.903748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:47.842 #5 NEW cov: 11761 ft: 13293 corp: 4/5b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:47.842 [2024-11-08 04:51:22.943798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.842 [2024-11-08 04:51:22.943823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:47.842 [2024-11-08 04:51:22.943880] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:47.842 [2024-11-08 04:51:22.943893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.101 #6 NEW cov: 11761 ft: 13513 corp: 5/7b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CrossOver- 00:07:48.101 
[2024-11-08 04:51:22.983867] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:22.983892] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.101 [2024-11-08 04:51:22.983948] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:22.983961] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.101 #7 NEW cov: 11761 ft: 13673 corp: 6/9b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:07:48.101 [2024-11-08 04:51:23.023959] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.023984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.101 [2024-11-08 04:51:23.024040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.024056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.101 #8 NEW cov: 11761 ft: 13863 corp: 7/11b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:07:48.101 [2024-11-08 04:51:23.063917] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.063941] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.101 #9 NEW cov: 11761 ft: 13986 corp: 8/12b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 ChangeByte- 00:07:48.101 [2024-11-08 04:51:23.104189] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.104215] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.101 [2024-11-08 04:51:23.104272] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000a cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.104286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.101 #10 NEW cov: 11761 ft: 14051 corp: 9/14b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:07:48.101 [2024-11-08 04:51:23.144302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.144327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.101 [2024-11-08 04:51:23.144385] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) 
qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.144399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.101 #11 NEW cov: 11761 ft: 14070 corp: 10/16b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CMP- DE: "\377\013"- 00:07:48.101 [2024-11-08 04:51:23.184443] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.184468] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.101 [2024-11-08 04:51:23.184530] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.101 [2024-11-08 04:51:23.184545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.101 #12 NEW cov: 11761 ft: 14090 corp: 11/18b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 InsertByte- 00:07:48.360 [2024-11-08 04:51:23.224554] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.224579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.360 [2024-11-08 04:51:23.224636] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.224650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.360 #13 NEW cov: 11761 ft: 14149 corp: 12/20b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:07:48.360 [2024-11-08 04:51:23.264690] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.264718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.360 [2024-11-08 04:51:23.264774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.264789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.360 #14 NEW cov: 11761 ft: 14171 corp: 13/22b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 ChangeBit- 00:07:48.360 [2024-11-08 04:51:23.304635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.304660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.360 #15 NEW cov: 11761 ft: 14221 corp: 14/23b lim: 5 exec/s: 0 rss: 68Mb L: 1/2 MS: 1 CrossOver- 00:07:48.360 [2024-11-08 04:51:23.344924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.344950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.360 [2024-11-08 04:51:23.345006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.345020] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.360 #16 NEW cov: 11761 ft: 14264 corp: 15/25b lim: 5 exec/s: 0 rss: 68Mb L: 2/2 MS: 1 CopyPart- 00:07:48.360 [2024-11-08 04:51:23.385170] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.385195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.360 [2024-11-08 04:51:23.385249] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.385262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.360 [2024-11-08 04:51:23.385317] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.385331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.360 #17 NEW cov: 11761 ft: 14452 corp: 16/28b lim: 5 exec/s: 0 rss: 68Mb L: 3/3 MS: 1 InsertByte- 00:07:48.360 [2024-11-08 04:51:23.425134] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.425158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.360 [2024-11-08 04:51:23.425213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.360 [2024-11-08 04:51:23.425227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.360 #18 NEW cov: 11761 ft: 14462 corp: 17/30b lim: 5 exec/s: 0 rss: 68Mb L: 2/3 MS: 1 ShuffleBytes- 00:07:48.361 [2024-11-08 04:51:23.465624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.361 [2024-11-08 04:51:23.465652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.361 [2024-11-08 04:51:23.465711] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.361 [2024-11-08 04:51:23.465725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.361 [2024-11-08 04:51:23.465779] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.361 [2024-11-08 04:51:23.465792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.361 [2024-11-08 04:51:23.465848] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.361 [2024-11-08 04:51:23.465862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.620 #19 NEW cov: 11761 ft: 14795 corp: 18/34b lim: 5 exec/s: 0 rss: 68Mb L: 4/4 MS: 1 PersAutoDict- DE: "\377\013"- 00:07:48.620 [2024-11-08 04:51:23.515577] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.515603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.620 [2024-11-08 04:51:23.515661] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.515674] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.620 [2024-11-08 04:51:23.515728] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000008 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.515742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.620 #20 NEW cov: 11761 ft: 14810 corp: 19/37b lim: 5 exec/s: 0 rss: 68Mb L: 3/4 MS: 1 InsertByte- 00:07:48.620 [2024-11-08 04:51:23.556004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.556029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.620 [2024-11-08 04:51:23.556086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.556101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.620 [2024-11-08 04:51:23.556154] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.556168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:48.620 [2024-11-08 04:51:23.556221] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 
04:51:23.556235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:48.620 [2024-11-08 04:51:23.556290] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.556306] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:48.620 #21 NEW cov: 11761 ft: 14898 corp: 20/42b lim: 5 exec/s: 0 rss: 68Mb L: 5/5 MS: 1 CMP- DE: "\005\000\000\000"- 00:07:48.620 [2024-11-08 04:51:23.595620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.595645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.620 [2024-11-08 04:51:23.595702] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.595715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.620 #22 NEW cov: 11761 ft: 14909 corp: 21/44b lim: 5 exec/s: 0 rss: 68Mb L: 2/5 MS: 1 InsertByte- 00:07:48.620 [2024-11-08 04:51:23.635778] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.635804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.620 [2024-11-08 04:51:23.635860] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.635874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:48.620 #23 NEW cov: 11761 ft: 14981 corp: 22/46b lim: 5 exec/s: 0 rss: 69Mb L: 2/5 MS: 1 ShuffleBytes- 00:07:48.620 [2024-11-08 04:51:23.675696] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.620 [2024-11-08 04:51:23.675721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.880 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:48.880 #24 NEW cov: 11784 ft: 15021 corp: 23/47b lim: 5 exec/s: 24 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:07:48.880 [2024-11-08 04:51:23.976798] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:48.880 [2024-11-08 04:51:23.976830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:48.880 [2024-11-08 04:51:23.976890] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:07:48.880 [2024-11-08 04:51:23.976904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.140 #25 NEW cov: 11784 ft: 15036 corp: 24/49b lim: 5 exec/s: 25 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:49.140 [2024-11-08 04:51:24.017198] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.017224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.140 [2024-11-08 04:51:24.017283] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.017296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.140 [2024-11-08 04:51:24.017353] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.017375] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.140 [2024-11-08 04:51:24.017433] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.017446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.140 #26 NEW cov: 11784 ft: 15053 corp: 25/53b lim: 5 exec/s: 26 rss: 70Mb L: 4/5 MS: 1 CrossOver- 00:07:49.140 [2024-11-08 04:51:24.067143] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.067169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.140 [2024-11-08 04:51:24.067228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.067242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.140 [2024-11-08 04:51:24.067299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.067313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.140 #27 NEW cov: 11784 ft: 15072 corp: 26/56b lim: 5 exec/s: 27 rss: 70Mb L: 3/5 MS: 1 ChangeByte- 00:07:49.140 [2024-11-08 04:51:24.106921] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.106946] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.140 #28 NEW 
cov: 11784 ft: 15114 corp: 27/57b lim: 5 exec/s: 28 rss: 70Mb L: 1/5 MS: 1 EraseBytes- 00:07:49.140 [2024-11-08 04:51:24.147006] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000006 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.147031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.140 #29 NEW cov: 11784 ft: 15126 corp: 28/58b lim: 5 exec/s: 29 rss: 70Mb L: 1/5 MS: 1 ChangeByte- 00:07:49.140 [2024-11-08 04:51:24.187827] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.140 [2024-11-08 04:51:24.187853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.140 [2024-11-08 04:51:24.187914] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.141 [2024-11-08 04:51:24.187928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.141 [2024-11-08 04:51:24.187984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.141 [2024-11-08 04:51:24.187998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.141 [2024-11-08 04:51:24.188053] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.141 [2024-11-08 04:51:24.188067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.141 [2024-11-08 04:51:24.188129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.141 [2024-11-08 04:51:24.188143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.141 #30 NEW cov: 11784 ft: 15167 corp: 29/63b lim: 5 exec/s: 30 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:49.141 [2024-11-08 04:51:24.237467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.141 [2024-11-08 04:51:24.237492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.141 [2024-11-08 04:51:24.237555] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.141 [2024-11-08 04:51:24.237572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.400 #31 NEW cov: 11784 ft: 15202 corp: 30/65b lim: 5 exec/s: 31 rss: 70Mb L: 2/5 MS: 1 CopyPart- 00:07:49.400 [2024-11-08 04:51:24.277756] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE 
MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000009 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.277781] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.277839] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.277853] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.277912] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.277926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.400 #32 NEW cov: 11784 ft: 15210 corp: 31/68b lim: 5 exec/s: 32 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:49.400 [2024-11-08 04:51:24.317538] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.317563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.400 #33 NEW cov: 11784 ft: 15227 corp: 32/69b lim: 5 exec/s: 33 rss: 70Mb L: 1/5 MS: 1 ChangeBinInt- 00:07:49.400 [2024-11-08 04:51:24.357981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.358006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.358064] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.358078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.358137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.358150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.400 #34 NEW cov: 11784 ft: 15254 corp: 33/72b lim: 5 exec/s: 34 rss: 70Mb L: 3/5 MS: 1 InsertByte- 00:07:49.400 [2024-11-08 04:51:24.397951] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.397976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.398033] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.398047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: 
INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.400 #35 NEW cov: 11784 ft: 15268 corp: 34/74b lim: 5 exec/s: 35 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:49.400 [2024-11-08 04:51:24.438598] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.438623] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.438681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.438695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.438751] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.438764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.438821] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.438834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.438891] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.438904] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.400 #36 NEW cov: 11784 ft: 15275 corp: 35/79b lim: 5 exec/s: 36 rss: 70Mb L: 5/5 MS: 1 ChangeBit- 00:07:49.400 [2024-11-08 04:51:24.478336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000f cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.478361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.478419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.400 [2024-11-08 04:51:24.478432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.400 [2024-11-08 04:51:24.478490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000007 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.401 [2024-11-08 04:51:24.478503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.401 #37 NEW cov: 11784 ft: 15329 corp: 36/82b lim: 5 exec/s: 37 rss: 70Mb L: 3/5 MS: 1 ChangeByte- 00:07:49.660 [2024-11-08 04:51:24.518242] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 
nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.518266] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.518328] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.518342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.660 #38 NEW cov: 11784 ft: 15334 corp: 37/84b lim: 5 exec/s: 38 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:49.660 [2024-11-08 04:51:24.558729] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.558754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.558814] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.558827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.558884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000005 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.558897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.558955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000002 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.558968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.660 #39 NEW cov: 11784 ft: 15349 corp: 38/88b lim: 5 exec/s: 39 rss: 70Mb L: 4/5 MS: 1 InsertRepeatedBytes- 00:07:49.660 [2024-11-08 04:51:24.598499] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.598527] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.598587] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000d cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.598604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.660 #40 NEW cov: 11784 ft: 15355 corp: 39/90b lim: 5 exec/s: 40 rss: 70Mb L: 2/5 MS: 1 ChangeByte- 00:07:49.660 [2024-11-08 04:51:24.639091] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.639116] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.639176] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.639190] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.639247] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.639262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.639320] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.639334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.639395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.639411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:49.660 #41 NEW cov: 11784 ft: 15358 corp: 40/95b lim: 5 exec/s: 41 rss: 70Mb L: 5/5 MS: 1 InsertRepeatedBytes- 00:07:49.660 [2024-11-08 04:51:24.678552] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:0000000e cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.678577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.718710] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000003 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.718735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.660 #43 NEW cov: 11784 ft: 15396 corp: 41/96b lim: 5 exec/s: 43 rss: 70Mb L: 1/5 MS: 2 ChangeBit-ChangeByte- 00:07:49.660 [2024-11-08 04:51:24.758984] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:4 nsid:0 cdw10:00000001 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.759009] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:49.660 [2024-11-08 04:51:24.759068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: NAMESPACE MANAGEMENT (0d) qid:0 cid:5 nsid:0 cdw10:0000000c cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:49.660 [2024-11-08 04:51:24.759082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:49.920 #44 NEW cov: 11784 ft: 15401 corp: 42/98b lim: 5 exec/s: 22 rss: 70Mb L: 2/5 MS: 1 ChangeBit- 00:07:49.920 #44 DONE cov: 11784 ft: 15401 corp: 42/98b lim: 5 exec/s: 22 rss: 70Mb 00:07:49.920 ###### Recommended dictionary. 
###### 00:07:49.920 "\377\013" # Uses: 1 00:07:49.920 "\005\000\000\000" # Uses: 0 00:07:49.920 ###### End of recommended dictionary. ###### 00:07:49.920 Done 44 runs in 2 second(s) 00:07:49.920 04:51:24 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_9.conf 00:07:49.920 04:51:24 -- ../common.sh@72 -- # (( i++ )) 00:07:49.920 04:51:24 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:49.920 04:51:24 -- ../common.sh@73 -- # start_llvm_fuzz 10 1 0x1 00:07:49.920 04:51:24 -- nvmf/run.sh@23 -- # local fuzzer_type=10 00:07:49.920 04:51:24 -- nvmf/run.sh@24 -- # local timen=1 00:07:49.920 04:51:24 -- nvmf/run.sh@25 -- # local core=0x1 00:07:49.920 04:51:24 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:49.920 04:51:24 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_10.conf 00:07:49.920 04:51:24 -- nvmf/run.sh@29 -- # printf %02d 10 00:07:49.920 04:51:24 -- nvmf/run.sh@29 -- # port=4410 00:07:49.920 04:51:24 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:49.920 04:51:24 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' 00:07:49.920 04:51:24 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4410"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:49.920 04:51:24 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4410' -c /tmp/fuzz_json_10.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 -Z 10 -r /var/tmp/spdk10.sock 00:07:49.920 [2024-11-08 04:51:24.951176] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:49.920 [2024-11-08 04:51:24.951269] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3683240 ] 00:07:49.920 EAL: No free 2048 kB hugepages reported on node 1 00:07:50.180 [2024-11-08 04:51:25.128291] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:50.180 [2024-11-08 04:51:25.194772] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:50.180 [2024-11-08 04:51:25.194915] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:50.180 [2024-11-08 04:51:25.253530] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:50.180 [2024-11-08 04:51:25.269860] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4410 *** 00:07:50.180 INFO: Running with entropic power schedule (0xFF, 100). 00:07:50.180 INFO: Seed: 88937341 00:07:50.439 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:50.439 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:50.439 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_10 00:07:50.439 INFO: A corpus is not provided, starting from an empty corpus 00:07:50.439 #2 INITED exec/s: 0 rss: 61Mb 00:07:50.439 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:50.439 This may also happen if the target rejected all inputs we tried so far 00:07:50.439 [2024-11-08 04:51:25.318407] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.439 [2024-11-08 04:51:25.318443] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.439 [2024-11-08 04:51:25.318493] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.439 [2024-11-08 04:51:25.318511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.699 NEW_FUNC[1/669]: 0x447688 in fuzz_admin_security_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:205 00:07:50.699 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:50.699 #6 NEW cov: 11578 ft: 11577 corp: 2/18b lim: 40 exec/s: 0 rss: 68Mb L: 17/17 MS: 4 ChangeByte-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:50.699 [2024-11-08 04:51:25.639200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.699 [2024-11-08 04:51:25.639238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.699 [2024-11-08 04:51:25.639288] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:ffff1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.699 [2024-11-08 04:51:25.639307] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.699 [2024-11-08 04:51:25.639337] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.699 [2024-11-08 04:51:25.639353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:50.699 NEW_FUNC[1/1]: 0x15097e8 in nvme_qpair_get_state /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvme/./nvme_internal.h:1456 00:07:50.699 #7 NEW cov: 11693 ft: 12188 corp: 3/44b lim: 40 exec/s: 0 rss: 68Mb L: 26/26 MS: 1 InsertRepeatedBytes- 00:07:50.699 [2024-11-08 04:51:25.709267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.699 [2024-11-08 04:51:25.709299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.699 [2024-11-08 04:51:25.709336] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.699 [2024-11-08 04:51:25.709352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.699 #8 NEW cov: 11699 ft: 12500 corp: 4/62b lim: 40 exec/s: 0 rss: 68Mb L: 
18/26 MS: 1 CrossOver- 00:07:50.699 [2024-11-08 04:51:25.759354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.699 [2024-11-08 04:51:25.759384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.699 [2024-11-08 04:51:25.759432] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161600 cdw11:12161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.699 [2024-11-08 04:51:25.759455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.958 #9 NEW cov: 11784 ft: 12886 corp: 5/80b lim: 40 exec/s: 0 rss: 68Mb L: 18/26 MS: 1 ChangeBinInt- 00:07:50.958 [2024-11-08 04:51:25.829489] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.958 [2024-11-08 04:51:25.829519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.958 #15 NEW cov: 11784 ft: 13250 corp: 6/94b lim: 40 exec/s: 0 rss: 69Mb L: 14/26 MS: 1 EraseBytes- 00:07:50.958 [2024-11-08 04:51:25.889698] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.958 [2024-11-08 04:51:25.889728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.958 [2024-11-08 04:51:25.889776] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161600 cdw11:ee161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.958 [2024-11-08 04:51:25.889791] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.959 #16 NEW cov: 11784 ft: 13359 corp: 7/112b lim: 40 exec/s: 0 rss: 69Mb L: 18/26 MS: 1 ChangeBinInt- 00:07:50.959 [2024-11-08 04:51:25.939803] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.959 [2024-11-08 04:51:25.939835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.959 [2024-11-08 04:51:25.939884] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:161600ee SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.959 [2024-11-08 04:51:25.939907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.959 #17 NEW cov: 11784 ft: 13457 corp: 8/133b lim: 40 exec/s: 0 rss: 69Mb L: 21/26 MS: 1 InsertRepeatedBytes- 00:07:50.959 [2024-11-08 04:51:25.999985] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161607 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.959 [2024-11-08 04:51:26.000014] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.959 [2024-11-08 04:51:26.000062] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.959 [2024-11-08 04:51:26.000084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:50.959 #18 NEW cov: 11784 ft: 13554 corp: 9/151b lim: 40 exec/s: 0 rss: 69Mb L: 18/26 MS: 1 CopyPart- 00:07:50.959 [2024-11-08 04:51:26.050096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.959 [2024-11-08 04:51:26.050125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:50.959 [2024-11-08 04:51:26.050173] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:1616165c cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:50.959 [2024-11-08 04:51:26.050195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.218 #19 NEW cov: 11784 ft: 13632 corp: 10/170b lim: 40 exec/s: 0 rss: 69Mb L: 19/26 MS: 1 InsertByte- 00:07:51.218 [2024-11-08 04:51:26.100225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.100254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.218 [2024-11-08 04:51:26.100302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16c81600 cdw11:12161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.100324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.218 #20 NEW cov: 11784 ft: 13670 corp: 11/188b lim: 40 exec/s: 0 rss: 69Mb L: 18/26 MS: 1 ChangeByte- 00:07:51.218 [2024-11-08 04:51:26.150350] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.150379] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.218 [2024-11-08 04:51:26.150427] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.150449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.218 #21 NEW cov: 11784 ft: 13739 corp: 12/205b lim: 40 exec/s: 0 rss: 69Mb L: 17/26 MS: 1 ShuffleBytes- 00:07:51.218 [2024-11-08 04:51:26.200496] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.200533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.218 [2024-11-08 04:51:26.200582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 
cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.200606] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.218 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:51.218 #22 NEW cov: 11801 ft: 13785 corp: 13/222b lim: 40 exec/s: 0 rss: 69Mb L: 17/26 MS: 1 EraseBytes- 00:07:51.218 [2024-11-08 04:51:26.250631] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.250660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.218 [2024-11-08 04:51:26.250708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161600 cdw11:12161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.250730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.218 #23 NEW cov: 11801 ft: 13865 corp: 14/240b lim: 40 exec/s: 0 rss: 69Mb L: 18/26 MS: 1 CrossOver- 00:07:51.218 [2024-11-08 04:51:26.300793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.300823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.218 [2024-11-08 04:51:26.300871] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:00000016 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.300893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.218 [2024-11-08 04:51:26.300923] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:1600ee16 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.218 [2024-11-08 04:51:26.300938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.478 #24 NEW cov: 11801 ft: 13888 corp: 15/264b lim: 40 exec/s: 24 rss: 69Mb L: 24/26 MS: 1 CopyPart- 00:07:51.478 [2024-11-08 04:51:26.361060] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.478 [2024-11-08 04:51:26.361089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.478 [2024-11-08 04:51:26.361137] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.478 [2024-11-08 04:51:26.361161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.478 [2024-11-08 04:51:26.361191] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.478 [2024-11-08 
04:51:26.361205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.478 [2024-11-08 04:51:26.361234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000c816 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.478 [2024-11-08 04:51:26.361249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.478 #25 NEW cov: 11801 ft: 14362 corp: 16/303b lim: 40 exec/s: 25 rss: 69Mb L: 39/39 MS: 1 InsertRepeatedBytes- 00:07:51.478 [2024-11-08 04:51:26.431128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07161616 cdw11:16161600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.478 [2024-11-08 04:51:26.431157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.479 [2024-11-08 04:51:26.431205] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.479 [2024-11-08 04:51:26.431220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.479 #26 NEW cov: 11801 ft: 14383 corp: 17/320b lim: 40 exec/s: 26 rss: 69Mb L: 17/39 MS: 1 CrossOver- 00:07:51.479 [2024-11-08 04:51:26.491333] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1607 cdw11:0a161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.479 [2024-11-08 04:51:26.491362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.479 [2024-11-08 04:51:26.491410] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16160000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.479 [2024-11-08 04:51:26.491433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.479 [2024-11-08 04:51:26.491463] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00001616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.479 [2024-11-08 04:51:26.491478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.479 #27 NEW cov: 11801 ft: 14393 corp: 18/350b lim: 40 exec/s: 27 rss: 69Mb L: 30/39 MS: 1 CrossOver- 00:07:51.479 [2024-11-08 04:51:26.561576] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161627 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.479 [2024-11-08 04:51:26.561605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.479 [2024-11-08 04:51:26.561654] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.479 [2024-11-08 04:51:26.561675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.479 [2024-11-08 04:51:26.561704] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.479 [2024-11-08 04:51:26.561719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.479 [2024-11-08 04:51:26.561748] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:0000c816 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.479 [2024-11-08 04:51:26.561763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.738 #28 NEW cov: 11801 ft: 14406 corp: 19/389b lim: 40 exec/s: 28 rss: 69Mb L: 39/39 MS: 1 ChangeBinInt- 00:07:51.738 [2024-11-08 04:51:26.621712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.621741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.738 [2024-11-08 04:51:26.621790] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.621813] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.738 [2024-11-08 04:51:26.621843] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.621859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.738 [2024-11-08 04:51:26.621888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:1600c800 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.621903] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.738 #29 NEW cov: 11801 ft: 14420 corp: 20/428b lim: 40 exec/s: 29 rss: 69Mb L: 39/39 MS: 1 ShuffleBytes- 00:07:51.738 [2024-11-08 04:51:26.671730] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07081616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.671760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.738 [2024-11-08 04:51:26.671808] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161600 cdw11:12161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.671829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.738 #30 NEW cov: 11801 ft: 14430 corp: 21/446b lim: 40 exec/s: 30 rss: 69Mb L: 18/39 MS: 1 ChangeBit- 00:07:51.738 [2024-11-08 04:51:26.721857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:01161616 cdw11:16161600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.721886] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.738 [2024-11-08 04:51:26.721934] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.721955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.738 #31 NEW cov: 11801 ft: 14442 corp: 22/463b lim: 40 exec/s: 31 rss: 69Mb L: 17/39 MS: 1 ChangeBinInt- 00:07:51.738 [2024-11-08 04:51:26.782042] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.782071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.738 [2024-11-08 04:51:26.782120] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161628 cdw11:ee161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.782142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.738 #32 NEW cov: 11801 ft: 14454 corp: 23/481b lim: 40 exec/s: 32 rss: 69Mb L: 18/39 MS: 1 ChangeByte- 00:07:51.738 [2024-11-08 04:51:26.832241] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:1616070a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.738 [2024-11-08 04:51:26.832270] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.739 [2024-11-08 04:51:26.832302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.739 [2024-11-08 04:51:26.832316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.739 [2024-11-08 04:51:26.832351] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.739 [2024-11-08 04:51:26.832365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.739 [2024-11-08 04:51:26.832392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.739 [2024-11-08 04:51:26.832406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:51.998 #33 NEW cov: 11801 ft: 14458 corp: 24/514b lim: 40 exec/s: 33 rss: 69Mb L: 33/39 MS: 1 CopyPart- 00:07:51.998 [2024-11-08 04:51:26.902373] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161607 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:26.902403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.998 [2024-11-08 04:51:26.902451] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:0a161616 cdw11:16165616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:26.902466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.998 #34 NEW cov: 11801 ft: 14475 corp: 25/532b lim: 40 exec/s: 34 rss: 69Mb L: 18/39 MS: 1 ChangeBit- 00:07:51.998 [2024-11-08 04:51:26.962603] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:26.962633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.998 [2024-11-08 04:51:26.962681] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161600 cdw11:12161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:26.962704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.998 [2024-11-08 04:51:26.962734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16160116 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:26.962749] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.998 #35 NEW cov: 11801 ft: 14543 corp: 26/556b lim: 40 exec/s: 35 rss: 70Mb L: 24/39 MS: 1 CrossOver- 00:07:51.998 [2024-11-08 04:51:27.012592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:27.012622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.998 #37 NEW cov: 11801 ft: 14552 corp: 27/569b lim: 40 exec/s: 37 rss: 70Mb L: 13/39 MS: 2 EraseBytes-InsertRepeatedBytes- 00:07:51.998 [2024-11-08 04:51:27.082962] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:27.082992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:51.998 [2024-11-08 04:51:27.083041] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16160012 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:27.083056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:51.998 [2024-11-08 04:51:27.083085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16011616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 04:51:27.083100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:51.998 [2024-11-08 04:51:27.083129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:16001216 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:51.998 [2024-11-08 
04:51:27.083143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.258 #38 NEW cov: 11801 ft: 14618 corp: 28/607b lim: 40 exec/s: 38 rss: 70Mb L: 38/39 MS: 1 CopyPart- 00:07:52.258 [2024-11-08 04:51:27.153156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.153186] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.258 [2024-11-08 04:51:27.153234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:070a1616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.153255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.258 [2024-11-08 04:51:27.153285] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161607 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.153304] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.258 [2024-11-08 04:51:27.153334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:0a161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.153348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.258 #39 NEW cov: 11801 ft: 14649 corp: 29/641b lim: 40 exec/s: 39 rss: 70Mb L: 34/39 MS: 1 CopyPart- 00:07:52.258 [2024-11-08 04:51:27.213238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07ff0a16 cdw11:070a1616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.213269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.258 [2024-11-08 04:51:27.213318] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:16161600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.213339] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.258 [2024-11-08 04:51:27.213369] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:00000016 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.213384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.258 #40 NEW cov: 11808 ft: 14663 corp: 30/672b lim: 40 exec/s: 40 rss: 70Mb L: 31/39 MS: 1 CrossOver- 00:07:52.258 [2024-11-08 04:51:27.273482] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:4 nsid:0 cdw10:07161616 cdw11:16161600 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.273513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:52.258 [2024-11-08 04:51:27.273570] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:5 nsid:0 cdw10:16161616 cdw11:1616070a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.273585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:52.258 [2024-11-08 04:51:27.273615] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:6 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.273630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:52.258 [2024-11-08 04:51:27.273659] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY RECEIVE (82) qid:0 cid:7 nsid:0 cdw10:16161616 cdw11:16161616 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:52.258 [2024-11-08 04:51:27.273673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:52.258 #41 NEW cov: 11808 ft: 14670 corp: 31/707b lim: 40 exec/s: 20 rss: 70Mb L: 35/39 MS: 1 CrossOver- 00:07:52.258 #41 DONE cov: 11808 ft: 14670 corp: 31/707b lim: 40 exec/s: 20 rss: 70Mb 00:07:52.258 Done 41 runs in 2 second(s) 00:07:52.518 04:51:27 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_10.conf 00:07:52.518 04:51:27 -- ../common.sh@72 -- # (( i++ )) 00:07:52.518 04:51:27 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:52.518 04:51:27 -- ../common.sh@73 -- # start_llvm_fuzz 11 1 0x1 00:07:52.518 04:51:27 -- nvmf/run.sh@23 -- # local fuzzer_type=11 00:07:52.518 04:51:27 -- nvmf/run.sh@24 -- # local timen=1 00:07:52.518 04:51:27 -- nvmf/run.sh@25 -- # local core=0x1 00:07:52.518 04:51:27 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:52.518 04:51:27 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_11.conf 00:07:52.518 04:51:27 -- nvmf/run.sh@29 -- # printf %02d 11 00:07:52.518 04:51:27 -- nvmf/run.sh@29 -- # port=4411 00:07:52.518 04:51:27 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:52.518 04:51:27 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' 00:07:52.518 04:51:27 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4411"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:52.518 04:51:27 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4411' -c /tmp/fuzz_json_11.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 -Z 11 -r /var/tmp/spdk11.sock 00:07:52.518 [2024-11-08 04:51:27.481568] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
00:07:52.518 [2024-11-08 04:51:27.481659] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3683783 ] 00:07:52.518 EAL: No free 2048 kB hugepages reported on node 1 00:07:52.778 [2024-11-08 04:51:27.662257] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:52.778 [2024-11-08 04:51:27.726899] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:52.778 [2024-11-08 04:51:27.727025] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.778 [2024-11-08 04:51:27.785148] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:52.778 [2024-11-08 04:51:27.801456] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4411 *** 00:07:52.778 INFO: Running with entropic power schedule (0xFF, 100). 00:07:52.778 INFO: Seed: 2621949101 00:07:52.778 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:52.778 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:52.778 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_11 00:07:52.778 INFO: A corpus is not provided, starting from an empty corpus 00:07:52.778 #2 INITED exec/s: 0 rss: 59Mb 00:07:52.778 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:07:52.778 This may also happen if the target rejected all inputs we tried so far 00:07:52.778 [2024-11-08 04:51:27.846110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0acb9b55 cdw11:19679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:52.778 [2024-11-08 04:51:27.846145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.296 NEW_FUNC[1/671]: 0x4493f8 in fuzz_admin_security_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:223 00:07:53.296 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:53.296 #3 NEW cov: 11592 ft: 11593 corp: 2/10b lim: 40 exec/s: 0 rss: 67Mb L: 9/9 MS: 1 CMP- DE: "\313\233U\031g\226\203\000"- 00:07:53.296 [2024-11-08 04:51:28.166907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0acb9b55 cdw11:19cb9b55 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.296 [2024-11-08 04:51:28.166945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.296 [2024-11-08 04:51:28.166994] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:19679683 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.296 [2024-11-08 04:51:28.167015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:53.296 #9 NEW cov: 11705 ft: 12767 corp: 3/26b lim: 40 exec/s: 0 rss: 67Mb L: 16/16 MS: 1 CopyPart- 00:07:53.296 [2024-11-08 04:51:28.236927] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a679683 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.296 
[2024-11-08 04:51:28.236962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.296 #10 NEW cov: 11711 ft: 13035 corp: 4/34b lim: 40 exec/s: 0 rss: 67Mb L: 8/16 MS: 1 EraseBytes- 00:07:53.296 [2024-11-08 04:51:28.297071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b5519 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.296 [2024-11-08 04:51:28.297100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.296 #11 NEW cov: 11796 ft: 13378 corp: 5/42b lim: 40 exec/s: 0 rss: 67Mb L: 8/16 MS: 1 PersAutoDict- DE: "\313\233U\031g\226\203\000"- 00:07:53.296 [2024-11-08 04:51:28.357225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a6a9683 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.296 [2024-11-08 04:51:28.357254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.296 #12 NEW cov: 11796 ft: 13459 corp: 6/50b lim: 40 exec/s: 0 rss: 67Mb L: 8/16 MS: 1 ChangeBinInt- 00:07:53.555 [2024-11-08 04:51:28.407376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a679683 cdw11:67836796 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.555 [2024-11-08 04:51:28.407406] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.555 #13 NEW cov: 11796 ft: 13604 corp: 7/60b lim: 40 exec/s: 0 rss: 67Mb L: 10/16 MS: 1 CopyPart- 00:07:53.555 [2024-11-08 04:51:28.457464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b5519 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.555 [2024-11-08 04:51:28.457494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.555 #14 NEW cov: 11796 ft: 13709 corp: 8/68b lim: 40 exec/s: 0 rss: 67Mb L: 8/16 MS: 1 ShuffleBytes- 00:07:53.556 [2024-11-08 04:51:28.527676] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6956973 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.556 [2024-11-08 04:51:28.527706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.556 #15 NEW cov: 11796 ft: 13735 corp: 9/76b lim: 40 exec/s: 0 rss: 67Mb L: 8/16 MS: 1 ChangeBinInt- 00:07:53.556 [2024-11-08 04:51:28.587846] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b5519 cdw11:679b5500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.556 [2024-11-08 04:51:28.587876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.556 #16 NEW cov: 11796 ft: 13782 corp: 10/84b lim: 40 exec/s: 0 rss: 67Mb L: 8/16 MS: 1 CopyPart- 00:07:53.556 [2024-11-08 04:51:28.648012] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.556 [2024-11-08 04:51:28.648042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 
00:07:53.814 #20 NEW cov: 11796 ft: 13835 corp: 11/97b lim: 40 exec/s: 0 rss: 67Mb L: 13/16 MS: 4 CrossOver-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:07:53.814 [2024-11-08 04:51:28.698093] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b5519 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.814 [2024-11-08 04:51:28.698122] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.814 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:53.814 #21 NEW cov: 11813 ft: 13869 corp: 12/106b lim: 40 exec/s: 0 rss: 68Mb L: 9/16 MS: 1 PersAutoDict- DE: "\313\233U\031g\226\203\000"- 00:07:53.814 [2024-11-08 04:51:28.758255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:55196796 cdw11:83008300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.814 [2024-11-08 04:51:28.758286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.814 #22 NEW cov: 11813 ft: 13888 corp: 13/114b lim: 40 exec/s: 0 rss: 68Mb L: 8/16 MS: 1 CrossOver- 00:07:53.814 [2024-11-08 04:51:28.818472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b5519 cdw11:2a679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.814 [2024-11-08 04:51:28.818504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.814 #23 NEW cov: 11813 ft: 13907 corp: 14/123b lim: 40 exec/s: 23 rss: 68Mb L: 9/16 MS: 1 InsertByte- 00:07:53.814 [2024-11-08 04:51:28.868545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a672e83 cdw11:67836796 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:53.814 [2024-11-08 04:51:28.868598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:53.814 #24 NEW cov: 11813 ft: 13921 corp: 15/133b lim: 40 exec/s: 24 rss: 68Mb L: 10/16 MS: 1 ChangeByte- 00:07:54.073 [2024-11-08 04:51:28.938888] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0acb9b55 cdw11:19679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-08 04:51:28.938921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.073 [2024-11-08 04:51:28.938955] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00679683 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-08 04:51:28.938971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.073 #25 NEW cov: 11813 ft: 13937 corp: 16/149b lim: 40 exec/s: 25 rss: 68Mb L: 16/16 MS: 1 PersAutoDict- DE: "\313\233U\031g\226\203\000"- 00:07:54.073 [2024-11-08 04:51:28.998946] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-08 04:51:28.998976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.073 [2024-11-08 04:51:28.999024] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-08 04:51:28.999040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.073 #26 NEW cov: 11813 ft: 13947 corp: 17/168b lim: 40 exec/s: 26 rss: 68Mb L: 19/19 MS: 1 InsertRepeatedBytes- 00:07:54.073 [2024-11-08 04:51:29.059038] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cbcb9b55 cdw11:19679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-08 04:51:29.059067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.073 #27 NEW cov: 11813 ft: 13968 corp: 18/177b lim: 40 exec/s: 27 rss: 68Mb L: 9/19 MS: 1 PersAutoDict- DE: "\313\233U\031g\226\203\000"- 00:07:54.073 [2024-11-08 04:51:29.119256] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-08 04:51:29.119286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.073 #28 NEW cov: 11813 ft: 14085 corp: 19/190b lim: 40 exec/s: 28 rss: 68Mb L: 13/19 MS: 1 ChangeBinInt- 00:07:54.073 [2024-11-08 04:51:29.179527] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a670a67 cdw11:96968367 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-08 04:51:29.179561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.073 [2024-11-08 04:51:29.179611] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:96830083 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.073 [2024-11-08 04:51:29.179627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.332 #29 NEW cov: 11813 ft: 14098 corp: 20/206b lim: 40 exec/s: 29 rss: 68Mb L: 16/19 MS: 1 CrossOver- 00:07:54.332 [2024-11-08 04:51:29.239520] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:96836796 cdw11:83000a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-08 04:51:29.239555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.332 #30 NEW cov: 11813 ft: 14178 corp: 21/220b lim: 40 exec/s: 30 rss: 68Mb L: 14/19 MS: 1 CopyPart- 00:07:54.332 [2024-11-08 04:51:29.289668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b3019 cdw11:679b5500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-08 04:51:29.289699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.332 #31 NEW cov: 11813 ft: 14191 corp: 22/228b lim: 40 exec/s: 31 rss: 68Mb L: 8/19 MS: 1 ChangeByte- 00:07:54.332 [2024-11-08 04:51:29.349855] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0acb9b55 cdw11:19679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-08 04:51:29.349884] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.332 #32 NEW cov: 11813 ft: 14276 corp: 23/237b lim: 40 exec/s: 32 rss: 68Mb L: 9/19 MS: 1 PersAutoDict- DE: "\313\233U\031g\226\203\000"- 00:07:54.332 [2024-11-08 04:51:29.400133] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:96836796 cdw11:83000a6a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-08 04:51:29.400162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.332 [2024-11-08 04:51:29.400209] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:a8a8a8a8 cdw11:a8a8a8a8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-08 04:51:29.400225] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.332 [2024-11-08 04:51:29.400255] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:6 nsid:0 cdw10:a8a8a8a8 cdw11:a8a8a8a8 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.332 [2024-11-08 04:51:29.400269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:54.332 [2024-11-08 04:51:29.400298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:7 nsid:0 cdw10:a8a8a8a8 cdw11:a8a89683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.333 [2024-11-08 04:51:29.400313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:54.591 #38 NEW cov: 11813 ft: 14629 corp: 24/273b lim: 40 exec/s: 38 rss: 68Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:07:54.591 [2024-11-08 04:51:29.470150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:f6626973 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.591 [2024-11-08 04:51:29.470179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.591 #39 NEW cov: 11813 ft: 14692 corp: 25/281b lim: 40 exec/s: 39 rss: 68Mb L: 8/36 MS: 1 ChangeBinInt- 00:07:54.591 [2024-11-08 04:51:29.520349] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b5519 cdw11:2a679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.592 [2024-11-08 04:51:29.520384] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.592 [2024-11-08 04:51:29.520418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:000a6a96 cdw11:83679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.592 [2024-11-08 04:51:29.520433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.592 #40 NEW cov: 11813 ft: 14770 corp: 26/298b lim: 40 exec/s: 40 rss: 68Mb L: 17/36 MS: 1 CrossOver- 00:07:54.592 [2024-11-08 04:51:29.580416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b5519 cdw11:67085500 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.592 [2024-11-08 04:51:29.580444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f 
p:0 m:0 dnr:0 00:07:54.592 #41 NEW cov: 11813 ft: 14784 corp: 27/306b lim: 40 exec/s: 41 rss: 68Mb L: 8/36 MS: 1 ChangeBinInt- 00:07:54.592 [2024-11-08 04:51:29.630645] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a100000 cdw11:00679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.592 [2024-11-08 04:51:29.630673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.592 [2024-11-08 04:51:29.630722] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00679683 cdw11:67968300 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.592 [2024-11-08 04:51:29.630737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.592 #42 NEW cov: 11813 ft: 14790 corp: 28/322b lim: 40 exec/s: 42 rss: 68Mb L: 16/36 MS: 1 ChangeBinInt- 00:07:54.592 [2024-11-08 04:51:29.700811] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:cb9b5519 cdw11:67679683 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.592 [2024-11-08 04:51:29.700841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.850 #43 NEW cov: 11820 ft: 14850 corp: 29/331b lim: 40 exec/s: 43 rss: 68Mb L: 9/36 MS: 1 CopyPart- 00:07:54.850 [2024-11-08 04:51:29.750933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:0a000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.850 [2024-11-08 04:51:29.750962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.850 [2024-11-08 04:51:29.750993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.850 [2024-11-08 04:51:29.751008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:54.851 #46 NEW cov: 11820 ft: 14866 corp: 30/350b lim: 40 exec/s: 46 rss: 68Mb L: 19/36 MS: 3 CrossOver-ShuffleBytes-InsertRepeatedBytes- 00:07:54.851 [2024-11-08 04:51:29.811062] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: SECURITY SEND (81) qid:0 cid:4 nsid:0 cdw10:00000000 cdw11:00200000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:54.851 [2024-11-08 04:51:29.811091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:54.851 #47 NEW cov: 11820 ft: 14876 corp: 31/363b lim: 40 exec/s: 23 rss: 69Mb L: 13/36 MS: 1 ChangeBit- 00:07:54.851 #47 DONE cov: 11820 ft: 14876 corp: 31/363b lim: 40 exec/s: 23 rss: 69Mb 00:07:54.851 ###### Recommended dictionary. ###### 00:07:54.851 "\313\233U\031g\226\203\000" # Uses: 5 00:07:54.851 ###### End of recommended dictionary. 
###### 00:07:54.851 Done 47 runs in 2 second(s) 00:07:55.109 04:51:29 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_11.conf 00:07:55.109 04:51:29 -- ../common.sh@72 -- # (( i++ )) 00:07:55.109 04:51:29 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:07:55.109 04:51:29 -- ../common.sh@73 -- # start_llvm_fuzz 12 1 0x1 00:07:55.109 04:51:29 -- nvmf/run.sh@23 -- # local fuzzer_type=12 00:07:55.109 04:51:29 -- nvmf/run.sh@24 -- # local timen=1 00:07:55.109 04:51:29 -- nvmf/run.sh@25 -- # local core=0x1 00:07:55.109 04:51:29 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:55.109 04:51:29 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_12.conf 00:07:55.109 04:51:29 -- nvmf/run.sh@29 -- # printf %02d 12 00:07:55.109 04:51:29 -- nvmf/run.sh@29 -- # port=4412 00:07:55.109 04:51:29 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:55.109 04:51:29 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' 00:07:55.109 04:51:29 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4412"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:07:55.109 04:51:29 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4412' -c /tmp/fuzz_json_12.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 -Z 12 -r /var/tmp/spdk12.sock 00:07:55.109 [2024-11-08 04:51:30.013444] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:07:55.109 [2024-11-08 04:51:30.013554] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3684091 ] 00:07:55.109 EAL: No free 2048 kB hugepages reported on node 1 00:07:55.109 [2024-11-08 04:51:30.194385] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:55.369 [2024-11-08 04:51:30.263144] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:07:55.369 [2024-11-08 04:51:30.263287] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:07:55.369 [2024-11-08 04:51:30.322168] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:55.369 [2024-11-08 04:51:30.338491] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4412 *** 00:07:55.369 INFO: Running with entropic power schedule (0xFF, 100). 00:07:55.369 INFO: Seed: 862978071 00:07:55.369 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:07:55.369 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:07:55.369 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_12 00:07:55.369 INFO: A corpus is not provided, starting from an empty corpus 00:07:55.369 #2 INITED exec/s: 0 rss: 60Mb 00:07:55.369 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:07:55.369 This may also happen if the target rejected all inputs we tried so far 00:07:55.369 [2024-11-08 04:51:30.405305] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.369 [2024-11-08 04:51:30.405341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.369 [2024-11-08 04:51:30.405461] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.369 [2024-11-08 04:51:30.405478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.369 [2024-11-08 04:51:30.405589] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.369 [2024-11-08 04:51:30.405607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.369 [2024-11-08 04:51:30.405723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.369 [2024-11-08 04:51:30.405740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.628 NEW_FUNC[1/671]: 0x44b168 in fuzz_admin_directive_send_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:241 00:07:55.628 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:07:55.628 #8 NEW cov: 11578 ft: 11591 corp: 2/34b lim: 40 exec/s: 0 rss: 67Mb L: 33/33 MS: 1 InsertRepeatedBytes- 00:07:55.629 [2024-11-08 04:51:30.726436] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-11-08 04:51:30.726480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.629 [2024-11-08 04:51:30.726629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-11-08 04:51:30.726650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.629 [2024-11-08 04:51:30.726788] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-11-08 04:51:30.726811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.629 [2024-11-08 04:51:30.726952] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.629 [2024-11-08 04:51:30.726973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.888 #9 NEW cov: 11703 ft: 12249 
corp: 3/68b lim: 40 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CrossOver- 00:07:55.888 [2024-11-08 04:51:30.786506] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.786540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.786672] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:7fffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.786690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.786818] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.786835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.786973] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.786991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.888 #10 NEW cov: 11709 ft: 12470 corp: 4/101b lim: 40 exec/s: 0 rss: 67Mb L: 33/34 MS: 1 ChangeBit- 00:07:55.888 [2024-11-08 04:51:30.836622] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.836651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.836785] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008396 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.836803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.836941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.836959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.837096] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.837113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.888 #11 NEW cov: 11794 ft: 12645 corp: 5/135b lim: 40 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 CMP- DE: "\000\203\226h\322\362o\234"- 00:07:55.888 [2024-11-08 04:51:30.896813] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.896841] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.896980] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:7ffffffd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.896999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.897125] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.897142] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.897278] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.897295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.888 #12 NEW cov: 11794 ft: 12783 corp: 6/168b lim: 40 exec/s: 0 rss: 68Mb L: 33/34 MS: 1 ChangeBit- 00:07:55.888 [2024-11-08 04:51:30.946963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff240a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.946989] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.947129] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008396 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.947146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.947269] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.947286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:55.888 [2024-11-08 04:51:30.947416] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:55.888 [2024-11-08 04:51:30.947435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:55.888 #18 NEW cov: 11794 ft: 12917 corp: 7/202b lim: 40 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeByte- 00:07:56.148 [2024-11-08 04:51:30.997185] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:30.997212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:30.997348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:30.997365] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:30.997521] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:30.997543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:30.997668] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:30.997685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.148 #19 NEW cov: 11794 ft: 12970 corp: 8/235b lim: 40 exec/s: 0 rss: 68Mb L: 33/34 MS: 1 ShuffleBytes- 00:07:56.148 [2024-11-08 04:51:31.047319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff240a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.047346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.047487] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008396 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.047505] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.047643] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.047660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.047787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.047805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.148 #20 NEW cov: 11794 ft: 12992 corp: 9/269b lim: 40 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 CrossOver- 00:07:56.148 [2024-11-08 04:51:31.097478] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.097507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.097629] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.097647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.097781] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 
04:51:31.097797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.097936] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.097953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.148 #21 NEW cov: 11794 ft: 13008 corp: 10/303b lim: 40 exec/s: 0 rss: 68Mb L: 34/34 MS: 1 ChangeByte- 00:07:56.148 [2024-11-08 04:51:31.147352] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affff0a cdw11:ff008332 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.147380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.147528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:68d2f26f cdw11:9cffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.147547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.147683] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.147699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.148 #22 NEW cov: 11794 ft: 13341 corp: 11/333b lim: 40 exec/s: 0 rss: 68Mb L: 30/34 MS: 1 EraseBytes- 00:07:56.148 [2024-11-08 04:51:31.207764] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.207792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.207933] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.207951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.208086] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.208104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.148 [2024-11-08 04:51:31.208238] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.148 [2024-11-08 04:51:31.208256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.148 #24 NEW cov: 11794 ft: 13372 corp: 12/369b lim: 40 exec/s: 0 rss: 68Mb L: 36/36 MS: 2 CrossOver-InsertRepeatedBytes- 00:07:56.408 [2024-11-08 04:51:31.258102] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.258131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.258264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:7fffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.258283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.258419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.258437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.258553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.258571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.409 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:56.409 #25 NEW cov: 11817 ft: 13483 corp: 13/402b lim: 40 exec/s: 0 rss: 68Mb L: 33/36 MS: 1 ShuffleBytes- 00:07:56.409 [2024-11-08 04:51:31.308392] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affbebe cdw11:bebebebe SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.308420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.308553] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff0aff00 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.308571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.308705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:833268d2 cdw11:f26f9cff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.308725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.308858] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.308876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.309014] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:8 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.309031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:56.409 #26 NEW 
cov: 11817 ft: 13645 corp: 14/442b lim: 40 exec/s: 0 rss: 68Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:07:56.409 [2024-11-08 04:51:31.358393] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff240a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.358420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.358557] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008396 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.358576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.358708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cfffff8 cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.358725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.358859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.358880] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.409 #27 NEW cov: 11817 ft: 13659 corp: 15/476b lim: 40 exec/s: 27 rss: 68Mb L: 34/40 MS: 1 ChangeBinInt- 00:07:56.409 [2024-11-08 04:51:31.418560] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.418588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.418732] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:7ffffffd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.418753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.418895] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.418912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.419068] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:30ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.419086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.409 #28 NEW cov: 11817 ft: 13669 corp: 16/509b lim: 40 exec/s: 28 rss: 68Mb L: 33/40 MS: 1 ChangeByte- 00:07:56.409 [2024-11-08 04:51:31.478750] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.478780] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.478930] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.478949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.479090] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cff21ff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.479110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.409 [2024-11-08 04:51:31.479244] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.409 [2024-11-08 04:51:31.479261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.409 #29 NEW cov: 11817 ft: 13678 corp: 17/544b lim: 40 exec/s: 29 rss: 68Mb L: 35/40 MS: 1 InsertByte- 00:07:56.669 [2024-11-08 04:51:31.528898] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.528927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.529051] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:7ffffffd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.529071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.529201] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.529220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.529355] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.529377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.669 #30 NEW cov: 11817 ft: 13683 corp: 18/582b lim: 40 exec/s: 30 rss: 68Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:07:56.669 [2024-11-08 04:51:31.578938] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.578972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.579108] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 
[2024-11-08 04:51:31.579125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.579264] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.579282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.579419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffff76 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.579437] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.669 #31 NEW cov: 11817 ft: 13693 corp: 19/621b lim: 40 exec/s: 31 rss: 68Mb L: 39/40 MS: 1 InsertRepeatedBytes- 00:07:56.669 [2024-11-08 04:51:31.629261] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.629290] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.629439] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.629457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.629592] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.629609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.629747] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.629767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.669 #32 NEW cov: 11817 ft: 13705 corp: 20/655b lim: 40 exec/s: 32 rss: 68Mb L: 34/40 MS: 1 ShuffleBytes- 00:07:56.669 [2024-11-08 04:51:31.679375] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.679404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.679548] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.679565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.679694] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 
0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.679713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.669 [2024-11-08 04:51:31.679844] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffff76 cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.679861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.669 #33 NEW cov: 11817 ft: 13711 corp: 21/694b lim: 40 exec/s: 33 rss: 68Mb L: 39/40 MS: 1 ChangeBit- 00:07:56.669 [2024-11-08 04:51:31.738528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff02ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.669 [2024-11-08 04:51:31.738556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.669 #38 NEW cov: 11817 ft: 14476 corp: 22/706b lim: 40 exec/s: 38 rss: 68Mb L: 12/40 MS: 5 ChangeByte-InsertByte-InsertByte-InsertByte-CMP- DE: "\377\377\377\377\377\377\002\377"- 00:07:56.929 [2024-11-08 04:51:31.789124] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:e6ffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.789153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:31.789292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff7fffff cdw11:fdffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.789310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.929 #41 NEW cov: 11817 ft: 14690 corp: 23/724b lim: 40 exec/s: 41 rss: 68Mb L: 18/40 MS: 3 CopyPart-ChangeByte-CrossOver- 00:07:56.929 [2024-11-08 04:51:31.839894] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.839923] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:31.840071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.840089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:31.840225] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:76767676 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.840243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:31.840372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:76ffffff cdw11:fffffffd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.840390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 
cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.929 #42 NEW cov: 11817 ft: 14771 corp: 24/756b lim: 40 exec/s: 42 rss: 69Mb L: 32/40 MS: 1 EraseBytes- 00:07:56.929 [2024-11-08 04:51:31.900095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.900123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:31.900267] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68d25f5f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.900284] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:31.900425] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:5f5ff26f cdw11:9cffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.900445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:31.900591] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.900611] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.929 #43 NEW cov: 11817 ft: 14783 corp: 25/794b lim: 40 exec/s: 43 rss: 69Mb L: 38/40 MS: 1 InsertRepeatedBytes- 00:07:56.929 [2024-11-08 04:51:31.949309] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff02ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:31.949337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.929 #44 NEW cov: 11817 ft: 14798 corp: 26/806b lim: 40 exec/s: 44 rss: 69Mb L: 12/40 MS: 1 ChangeByte- 00:07:56.929 [2024-11-08 04:51:32.010424] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:32.010452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:32.010599] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:32.010618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:32.010759] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:56.929 [2024-11-08 04:51:32.010778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:56.929 [2024-11-08 04:51:32.010920] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:00000000 cdw11:f7ffffff SGL DATA BLOCK OFFSET 0x0 
len:0x1000 00:07:56.929 [2024-11-08 04:51:32.010939] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:56.929 #45 NEW cov: 11817 ft: 14808 corp: 27/839b lim: 40 exec/s: 45 rss: 69Mb L: 33/40 MS: 1 ChangeBinInt- 00:07:57.188 [2024-11-08 04:51:32.070651] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff240a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.070678] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.188 [2024-11-08 04:51:32.070812] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008396 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.070830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.188 [2024-11-08 04:51:32.070963] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.070980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.188 [2024-11-08 04:51:32.071115] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffff83 cdw11:3268d2ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.071131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.188 #46 NEW cov: 11817 ft: 14818 corp: 28/873b lim: 40 exec/s: 46 rss: 69Mb L: 34/40 MS: 1 CrossOver- 00:07:57.188 [2024-11-08 04:51:32.119876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffff02ff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.119902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.188 #47 NEW cov: 11817 ft: 14823 corp: 29/885b lim: 40 exec/s: 47 rss: 69Mb L: 12/40 MS: 1 CrossOver- 00:07:57.188 [2024-11-08 04:51:32.169972] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ff02ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.169999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.188 #48 NEW cov: 11817 ft: 14839 corp: 30/897b lim: 40 exec/s: 48 rss: 69Mb L: 12/40 MS: 1 ShuffleBytes- 00:07:57.188 [2024-11-08 04:51:32.220942] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.220969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.188 [2024-11-08 04:51:32.221110] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68d2f26f SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.221129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE 
(00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.188 [2024-11-08 04:51:32.221260] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:9cffffff cdw11:76767776 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.221278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.188 [2024-11-08 04:51:32.221413] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:76ffffff cdw11:fffffffd SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.221429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:57.188 #49 NEW cov: 11817 ft: 14853 corp: 31/929b lim: 40 exec/s: 49 rss: 69Mb L: 32/40 MS: 1 ChangeBit- 00:07:57.188 [2024-11-08 04:51:32.280874] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffff0a SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.280901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.188 [2024-11-08 04:51:32.281039] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008332 cdw11:68ffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.281057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.188 [2024-11-08 04:51:32.281200] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.188 [2024-11-08 04:51:32.281219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.447 #50 NEW cov: 11817 ft: 14902 corp: 32/953b lim: 40 exec/s: 50 rss: 69Mb L: 24/40 MS: 1 EraseBytes- 00:07:57.447 [2024-11-08 04:51:32.341539] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.447 [2024-11-08 04:51:32.341566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:57.448 [2024-11-08 04:51:32.341705] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:7f25ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.448 [2024-11-08 04:51:32.341722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:57.448 [2024-11-08 04:51:32.341876] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.448 [2024-11-08 04:51:32.341897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:57.448 [2024-11-08 04:51:32.342045] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:07:57.448 [2024-11-08 04:51:32.342063] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:57.448 #51 NEW cov: 11817 ft: 14921 corp: 33/987b lim: 40 exec/s: 51 rss: 69Mb L: 34/40 MS: 1 InsertByte-
00:07:57.448 [2024-11-08 04:51:32.391632] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:4 nsid:0 cdw10:0affffff cdw11:ffff240a SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.448 [2024-11-08 04:51:32.391660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.448 [2024-11-08 04:51:32.391793] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:5 nsid:0 cdw10:ff008396 cdw11:68d2ffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.448 [2024-11-08 04:51:32.391811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:57.448 [2024-11-08 04:51:32.391945] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:6 nsid:0 cdw10:f26f9cff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.448 [2024-11-08 04:51:32.391962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.448 [2024-11-08 04:51:32.392102] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE SEND (19) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL DATA BLOCK OFFSET 0x0 len:0x1000
00:07:57.448 [2024-11-08 04:51:32.392119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:57.448 #52 NEW cov: 11817 ft: 14938 corp: 34/1021b lim: 40 exec/s: 26 rss: 69Mb L: 34/40 MS: 1 ShuffleBytes-
00:07:57.448 #52 DONE cov: 11817 ft: 14938 corp: 34/1021b lim: 40 exec/s: 26 rss: 69Mb
00:07:57.448 ###### Recommended dictionary. ######
00:07:57.448 "\000\203\226h\322\362o\234" # Uses: 0
00:07:57.448 "\377\377\377\377\377\377\002\377" # Uses: 0
00:07:57.448 ###### End of recommended dictionary. ######
00:07:57.448 Done 52 runs in 2 second(s)
00:07:57.448 04:51:32 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_12.conf
00:07:57.448 04:51:32 -- ../common.sh@72 -- # (( i++ ))
00:07:57.448 04:51:32 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:07:57.448 04:51:32 -- ../common.sh@73 -- # start_llvm_fuzz 13 1 0x1
00:07:57.448 04:51:32 -- nvmf/run.sh@23 -- # local fuzzer_type=13
00:07:57.448 04:51:32 -- nvmf/run.sh@24 -- # local timen=1
00:07:57.448 04:51:32 -- nvmf/run.sh@25 -- # local core=0x1
00:07:57.448 04:51:32 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:07:57.448 04:51:32 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_13.conf
00:07:57.448 04:51:32 -- nvmf/run.sh@29 -- # printf %02d 13
00:07:57.448 04:51:32 -- nvmf/run.sh@29 -- # port=4413
00:07:57.448 04:51:32 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:07:57.448 04:51:32 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413'
00:07:57.448 04:51:32 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4413"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:07:57.448 04:51:32 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4413' -c /tmp/fuzz_json_13.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13 -Z 13 -r /var/tmp/spdk13.sock
00:07:57.707 [2024-11-08 04:51:32.576123] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:07:57.707 [2024-11-08 04:51:32.576187] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3684621 ]
00:07:57.707 EAL: No free 2048 kB hugepages reported on node 1
00:07:57.707 [2024-11-08 04:51:32.751612] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:57.707 [2024-11-08 04:51:32.815118] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:07:57.707 [2024-11-08 04:51:32.815244] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:07:57.707 [2024-11-08 04:51:32.873407] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:07:57.707 [2024-11-08 04:51:32.889711] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4413 ***
00:07:57.967 INFO: Running with entropic power schedule (0xFF, 100).
00:07:57.967 INFO: Seed: 3415962512
00:07:57.967 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:07:57.967 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:07:57.967 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_13
00:07:57.967 INFO: A corpus is not provided, starting from an empty corpus
00:07:57.967 #2 INITED exec/s: 0 rss: 60Mb
00:07:57.967 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:07:57.967 This may also happen if the target rejected all inputs we tried so far
00:07:57.967 [2024-11-08 04:51:32.934507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:57.967 [2024-11-08 04:51:32.934548] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:57.967 [2024-11-08 04:51:32.934582] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:57.967 [2024-11-08 04:51:32.934598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:57.967 [2024-11-08 04:51:32.934627] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:57.967 [2024-11-08 04:51:32.934642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:57.967 [2024-11-08 04:51:32.934671] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:57.967 [2024-11-08 04:51:32.934686] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
00:07:58.227 NEW_FUNC[1/670]: 0x44cd38 in fuzz_admin_directive_receive_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:257
00:07:58.227 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780
00:07:58.227 #8 NEW cov: 11578 ft: 11579 corp: 2/35b lim: 40 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 InsertRepeatedBytes-
00:07:58.227 [2024-11-08 04:51:33.255239] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:58.227 [2024-11-08 04:51:33.255278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0
00:07:58.227 [2024-11-08 04:51:33.255327] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:58.227 [2024-11-08 04:51:33.255342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0
00:07:58.227 [2024-11-08 04:51:33.255372] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:58.227 [2024-11-08 04:51:33.255390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0
00:07:58.227 [2024-11-08 04:51:33.255419] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff09ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0
00:07:58.227 [2024-11-08 04:51:33.255434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0
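The NEW_FUNC lines above identify the fuzz target for this run: TestOneInput (llvm_nvme_fuzz.c:780) hands each libFuzzer-generated input to a per-opcode handler such as fuzz_admin_directive_receive_command (llvm_nvme_fuzz.c:257). Each NOTICE pair in the log is one fuzzed admin command: nvme_admin_qpair_print_command shows the submission (here DIRECTIVE RECEIVE, opcode 1a, with fuzzed cdw10/cdw11 values), and spdk_nvme_print_completion shows the target's response, where (00/01) reads as (status code type/status code), i.e. a generic-status Invalid Command Opcode rejection. A minimal sketch of that shape, using only public SPDK APIs and assuming a controller the harness has already connected; the real handlers build each command field by field according to the fuzzer type (-Z 13 here) rather than with a flat memcpy, so treat this as a reading aid for the log lines, not the fuzzer's actual code:

/* Sketch only: how a libFuzzer entry point drives the NOTICE pairs above.
 * Assumes g_ctrlr was connected elsewhere (e.g. via spdk_nvme_connect()
 * against the TCP trid printed by nvmf/run.sh); error handling trimmed. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <string.h>
#include "spdk/nvme.h"

static struct spdk_nvme_ctrlr *g_ctrlr;

static void
fuzz_cpl_cb(void *done, const struct spdk_nvme_cpl *cpl)
{
	/* "(00/01)" in the completions above is (sct/sc):
	 * SPDK_NVME_SCT_GENERIC / SPDK_NVME_SC_INVALID_OPCODE. */
	if (cpl->status.sct == SPDK_NVME_SCT_GENERIC &&
	    cpl->status.sc == SPDK_NVME_SC_INVALID_OPCODE) {
		/* target rejected the fuzzed opcode: the common case in this log */
	}
	*(bool *)done = true;
}

int
LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
	struct spdk_nvme_cmd cmd = {};
	bool done = false;

	/* Map raw fuzz bytes onto the 64-byte admin command; the cdw10/cdw11
	 * values printed in the log come straight from input bytes like this. */
	memcpy(&cmd, data, size < sizeof(cmd) ? size : sizeof(cmd));

	if (spdk_nvme_ctrlr_cmd_admin_raw(g_ctrlr, &cmd, NULL, 0,
					  fuzz_cpl_cb, &done) != 0) {
		return 0;
	}
	while (!done) {
		spdk_nvme_ctrlr_process_admin_completions(g_ctrlr);
	}
	return 0;
}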
00:07:58.227 #9 NEW cov: 11691 ft: 12192 corp: 3/69b lim: 40 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeByte- 00:07:58.227 [2024-11-08 04:51:33.325402] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.227 [2024-11-08 04:51:33.325434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.227 [2024-11-08 04:51:33.325467] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.227 [2024-11-08 04:51:33.325483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.227 [2024-11-08 04:51:33.325512] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.227 [2024-11-08 04:51:33.325533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.227 [2024-11-08 04:51:33.325563] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.227 [2024-11-08 04:51:33.325578] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.227 [2024-11-08 04:51:33.325606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.227 [2024-11-08 04:51:33.325621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.487 #14 NEW cov: 11697 ft: 12456 corp: 4/109b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 5 CopyPart-CrossOver-ChangeBit-ShuffleBytes-InsertRepeatedBytes- 00:07:58.487 [2024-11-08 04:51:33.375446] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.375476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.375533] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.375549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.375579] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.375595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.375624] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.375639] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.487 #15 NEW cov: 11782 ft: 12745 corp: 5/143b lim: 40 exec/s: 0 rss: 69Mb L: 34/40 MS: 1 CopyPart- 00:07:58.487 [2024-11-08 04:51:33.435700] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.435736] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.435770] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0a7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.435789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.435820] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.435836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.435866] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.435882] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.435911] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.435926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.487 #16 NEW cov: 11782 ft: 12863 corp: 6/183b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 CrossOver- 00:07:58.487 [2024-11-08 04:51:33.495774] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffff27 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.495804] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.495852] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.495868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.495897] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.495912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.495941] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ff09ffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:58.487 [2024-11-08 04:51:33.495956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.487 #17 NEW cov: 11782 ft: 12922 corp: 7/217b lim: 40 exec/s: 0 rss: 69Mb L: 34/40 MS: 1 ChangeByte- 00:07:58.487 [2024-11-08 04:51:33.555924] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.555953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.556002] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.556018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.556047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.556066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.487 [2024-11-08 04:51:33.556095] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.487 [2024-11-08 04:51:33.556109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.747 #18 NEW cov: 11782 ft: 13014 corp: 8/255b lim: 40 exec/s: 0 rss: 69Mb L: 38/40 MS: 1 CMP- DE: "\000\000\000\000"- 00:07:58.747 [2024-11-08 04:51:33.626150] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.626180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.626213] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.626228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.626257] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.626272] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.626301] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.626315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.747 #19 NEW cov: 11782 ft: 13049 corp: 9/293b lim: 40 exec/s: 0 rss: 69Mb L: 38/40 MS: 1 ChangeBinInt- 00:07:58.747 [2024-11-08 
04:51:33.686299] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.686328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.686376] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0a7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.686391] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.686420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.686436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.686464] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.686479] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.686507] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.686528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:58.747 #20 NEW cov: 11782 ft: 13112 corp: 10/333b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 CopyPart- 00:07:58.747 [2024-11-08 04:51:33.746319] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.746348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.746395] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.746411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.747 #21 NEW cov: 11782 ft: 13719 corp: 11/353b lim: 40 exec/s: 0 rss: 69Mb L: 20/40 MS: 1 EraseBytes- 00:07:58.747 [2024-11-08 04:51:33.806544] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.806573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.806620] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.806636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.806665] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.806680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:58.747 [2024-11-08 04:51:33.806708] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:58.747 [2024-11-08 04:51:33.806723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:58.747 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:07:58.747 #22 NEW cov: 11805 ft: 13752 corp: 12/391b lim: 40 exec/s: 0 rss: 70Mb L: 38/40 MS: 1 CopyPart- 00:07:59.055 [2024-11-08 04:51:33.856738] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.856770] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.856804] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.856820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.856849] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.856865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.856896] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.856911] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.055 #23 NEW cov: 11805 ft: 13769 corp: 13/429b lim: 40 exec/s: 0 rss: 70Mb L: 38/40 MS: 1 CrossOver- 00:07:59.055 [2024-11-08 04:51:33.906824] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff050000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.906858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.906905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.906921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.906949] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 
cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.906964] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.906993] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.907008] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.055 #24 NEW cov: 11805 ft: 13794 corp: 14/467b lim: 40 exec/s: 24 rss: 70Mb L: 38/40 MS: 1 ChangeBinInt- 00:07:59.055 [2024-11-08 04:51:33.967836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.967862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.967918] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.967932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.967988] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.968000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:33.968054] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:33.968067] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.055 #25 NEW cov: 11805 ft: 13895 corp: 15/505b lim: 40 exec/s: 25 rss: 70Mb L: 38/40 MS: 1 ShuffleBytes- 00:07:59.055 [2024-11-08 04:51:34.008085] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c00 cdw11:0000007c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:34.008110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:34.008166] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:34.008180] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:34.008236] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:34.008249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:34.008304] nvme_qpair.c: 
225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:34.008320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.055 [2024-11-08 04:51:34.008374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.055 [2024-11-08 04:51:34.008387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.055 #26 NEW cov: 11805 ft: 13918 corp: 16/545b lim: 40 exec/s: 26 rss: 70Mb L: 40/40 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:59.056 [2024-11-08 04:51:34.048066] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffbc cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.048091] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.048165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.048179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.048234] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.048247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.048302] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.048315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.056 #27 NEW cov: 11805 ft: 13950 corp: 17/583b lim: 40 exec/s: 27 rss: 70Mb L: 38/40 MS: 1 ChangeByte- 00:07:59.056 [2024-11-08 04:51:34.088183] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.088208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.088263] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.088277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.088348] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.088361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 
cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.088417] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.088430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.056 #28 NEW cov: 11805 ft: 13982 corp: 18/621b lim: 40 exec/s: 28 rss: 70Mb L: 38/40 MS: 1 ChangeBinInt- 00:07:59.056 [2024-11-08 04:51:34.128271] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.128298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.128354] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.128374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.128429] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.128442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.056 [2024-11-08 04:51:34.128498] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.056 [2024-11-08 04:51:34.128511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.056 #29 NEW cov: 11805 ft: 14020 corp: 19/656b lim: 40 exec/s: 29 rss: 70Mb L: 35/40 MS: 1 CrossOver- 00:07:59.316 [2024-11-08 04:51:34.168418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.168444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.168501] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.168515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.168578] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.168591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.168648] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:000000ff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.168661] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.316 #35 NEW cov: 11805 ft: 14035 corp: 20/694b lim: 40 exec/s: 35 rss: 70Mb L: 38/40 MS: 1 PersAutoDict- DE: "\000\000\000\000"- 00:07:59.316 [2024-11-08 04:51:34.208526] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.208550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.208606] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0a7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.208620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.208675] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.208688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.208744] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.208757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.316 #36 NEW cov: 11805 ft: 14058 corp: 21/729b lim: 40 exec/s: 36 rss: 70Mb L: 35/40 MS: 1 EraseBytes- 00:07:59.316 [2024-11-08 04:51:34.248755] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c00 cdw11:0000007c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.248779] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.248836] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.248849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.248905] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.248918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.248974] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7caf7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.248987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.249043] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL 
TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.249056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.316 #37 NEW cov: 11805 ft: 14074 corp: 22/769b lim: 40 exec/s: 37 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:07:59.316 [2024-11-08 04:51:34.288875] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.288900] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.288958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0a7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.288972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.289029] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.289042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.289097] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.289111] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.289165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.289178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.316 #38 NEW cov: 11805 ft: 14116 corp: 23/809b lim: 40 exec/s: 38 rss: 70Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:59.316 [2024-11-08 04:51:34.328958] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c00 cdw11:0000007c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.328983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.329040] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.329057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.329112] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.329125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.329182] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE 
RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:af7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.329195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.329251] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.329263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.316 #39 NEW cov: 11805 ft: 14160 corp: 24/849b lim: 40 exec/s: 39 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:07:59.316 [2024-11-08 04:51:34.368997] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff7fffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.369022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.369079] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.369092] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.369165] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.369179] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.369235] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.369249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.316 #40 NEW cov: 11805 ft: 14174 corp: 25/887b lim: 40 exec/s: 40 rss: 70Mb L: 38/40 MS: 1 ChangeBit- 00:07:59.316 [2024-11-08 04:51:34.409063] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.409088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.409144] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.409157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.409215] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.316 [2024-11-08 04:51:34.409229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.316 [2024-11-08 04:51:34.409284] 
nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffff00 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.317 [2024-11-08 04:51:34.409300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.576 #41 NEW cov: 11805 ft: 14201 corp: 26/922b lim: 40 exec/s: 41 rss: 70Mb L: 35/40 MS: 1 ChangeBit- 00:07:59.576 [2024-11-08 04:51:34.449338] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c00 cdw11:0000007c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.576 [2024-11-08 04:51:34.449363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.576 [2024-11-08 04:51:34.449420] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.576 [2024-11-08 04:51:34.449434] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.449490] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c327c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.449502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.449566] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:af7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.449579] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.449635] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.449648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.577 #42 NEW cov: 11805 ft: 14211 corp: 27/962b lim: 40 exec/s: 42 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:07:59.577 [2024-11-08 04:51:34.489047] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.489071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.489128] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.489141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.577 #43 NEW cov: 11805 ft: 14248 corp: 28/982b lim: 40 exec/s: 43 rss: 70Mb L: 20/40 MS: 1 ChangeBinInt- 00:07:59.577 [2024-11-08 04:51:34.529545] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff010000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 
04:51:34.529569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.529640] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00ffffff cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.529654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.529712] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00ffff00 cdw11:000000ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.529726] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.529787] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.529800] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.529857] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:00000000 cdw11:ffffff0a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.529870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.577 #44 NEW cov: 11805 ft: 14251 corp: 29/1022b lim: 40 exec/s: 44 rss: 70Mb L: 40/40 MS: 1 CopyPart- 00:07:59.577 [2024-11-08 04:51:34.569641] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c00 cdw11:0000007c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.569667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.569723] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.569737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.569792] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.569805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.569859] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:af7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.569872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.569925] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.569939] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.577 #45 NEW cov: 11805 ft: 14256 corp: 30/1062b lim: 40 exec/s: 45 rss: 70Mb L: 40/40 MS: 1 ShuffleBytes- 00:07:59.577 [2024-11-08 04:51:34.609334] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.609359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.609418] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.609432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.577 #46 NEW cov: 11805 ft: 14274 corp: 31/1083b lim: 40 exec/s: 46 rss: 70Mb L: 21/40 MS: 1 CrossOver- 00:07:59.577 [2024-11-08 04:51:34.649734] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ff7fffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.649759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.649819] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ff040000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.649833] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.649907] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00ffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.649921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.577 [2024-11-08 04:51:34.649981] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffff0000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.577 [2024-11-08 04:51:34.649994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.577 #47 NEW cov: 11805 ft: 14316 corp: 32/1121b lim: 40 exec/s: 47 rss: 70Mb L: 38/40 MS: 1 CrossOver- 00:07:59.837 [2024-11-08 04:51:34.690004] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.690029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.690087] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:0a7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.690100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.690156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: 
*NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c25 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.690170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.690224] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.690238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.690292] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.690305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.837 #48 NEW cov: 11805 ft: 14321 corp: 33/1161b lim: 40 exec/s: 48 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:07:59.837 [2024-11-08 04:51:34.729697] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffcd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.729722] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.729782] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.729796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.837 #49 NEW cov: 11805 ft: 14453 corp: 34/1182b lim: 40 exec/s: 49 rss: 70Mb L: 21/40 MS: 1 InsertByte- 00:07:59.837 [2024-11-08 04:51:34.770057] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.770083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.770156] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.770172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.770228] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.770242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.770298] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.770311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.837 #50 NEW cov: 
11805 ft: 14463 corp: 35/1220b lim: 40 exec/s: 50 rss: 70Mb L: 38/40 MS: 1 EraseBytes- 00:07:59.837 [2024-11-08 04:51:34.810291] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:7c7c7c00 cdw11:0000007c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.810316] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.810374] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:7c7c7c7c cdw11:7c7c7c78 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.810387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.810445] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:7c7c7c7c cdw11:7c327c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.810459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.810513] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:7c7c7c7c cdw11:af7c7c7c SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.810530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.810588] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:8 nsid:0 cdw10:7c7c7c7c cdw11:7c7c0a0e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.810600] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:07:59.837 #51 NEW cov: 11805 ft: 14509 corp: 36/1260b lim: 40 exec/s: 51 rss: 70Mb L: 40/40 MS: 1 ChangeBit- 00:07:59.837 [2024-11-08 04:51:34.850071] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffcd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.850096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.850180] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:fffffff7 cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.850195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.837 #52 NEW cov: 11805 ft: 14533 corp: 37/1281b lim: 40 exec/s: 52 rss: 70Mb L: 21/40 MS: 1 ChangeBinInt- 00:07:59.837 [2024-11-08 04:51:34.890472] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:ffffffff cdw11:ffffffcd SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.837 [2024-11-08 04:51:34.890497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.837 [2024-11-08 04:51:34.890559] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:ffffffff cdw11:ffcdffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:07:59.838 [2024-11-08 04:51:34.890575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.838 [2024-11-08 04:51:34.890633] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.838 [2024-11-08 04:51:34.890647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:07:59.838 [2024-11-08 04:51:34.890703] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:7 nsid:0 cdw10:ffffffff cdw11:ffffffff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.838 [2024-11-08 04:51:34.890715] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:07:59.838 #53 NEW cov: 11805 ft: 14540 corp: 38/1315b lim: 40 exec/s: 53 rss: 70Mb L: 34/40 MS: 1 CopyPart- 00:07:59.838 [2024-11-08 04:51:34.930440] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:4 nsid:0 cdw10:13000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.838 [2024-11-08 04:51:34.930465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:07:59.838 [2024-11-08 04:51:34.930528] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:5 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.838 [2024-11-08 04:51:34.930542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:07:59.838 [2024-11-08 04:51:34.930614] nvme_qpair.c: 225:nvme_admin_qpair_print_command: *NOTICE*: DIRECTIVE RECEIVE (1a) qid:0 cid:6 nsid:0 cdw10:00000000 cdw11:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:07:59.838 [2024-11-08 04:51:34.930627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.098 #57 NEW cov: 11805 ft: 14715 corp: 39/1345b lim: 40 exec/s: 28 rss: 70Mb L: 30/40 MS: 4 ChangeBinInt-InsertByte-EraseBytes-InsertRepeatedBytes- 00:08:00.098 #57 DONE cov: 11805 ft: 14715 corp: 39/1345b lim: 40 exec/s: 28 rss: 70Mb 00:08:00.098 ###### Recommended dictionary. ###### 00:08:00.098 "\000\000\000\000" # Uses: 2 00:08:00.098 ###### End of recommended dictionary. 
###### 00:08:00.098 Done 57 runs in 2 second(s) 00:08:00.098 04:51:35 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_13.conf 00:08:00.098 04:51:35 -- ../common.sh@72 -- # (( i++ )) 00:08:00.098 04:51:35 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:00.098 04:51:35 -- ../common.sh@73 -- # start_llvm_fuzz 14 1 0x1 00:08:00.098 04:51:35 -- nvmf/run.sh@23 -- # local fuzzer_type=14 00:08:00.098 04:51:35 -- nvmf/run.sh@24 -- # local timen=1 00:08:00.098 04:51:35 -- nvmf/run.sh@25 -- # local core=0x1 00:08:00.098 04:51:35 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:00.098 04:51:35 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_14.conf 00:08:00.098 04:51:35 -- nvmf/run.sh@29 -- # printf %02d 14 00:08:00.098 04:51:35 -- nvmf/run.sh@29 -- # port=4414 00:08:00.098 04:51:35 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:00.098 04:51:35 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' 00:08:00.098 04:51:35 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4414"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:00.098 04:51:35 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4414' -c /tmp/fuzz_json_14.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 -Z 14 -r /var/tmp/spdk14.sock 00:08:00.098 [2024-11-08 04:51:35.117015] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:00.098 [2024-11-08 04:51:35.117085] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3685150 ] 00:08:00.098 EAL: No free 2048 kB hugepages reported on node 1 00:08:00.357 [2024-11-08 04:51:35.295894] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:00.357 [2024-11-08 04:51:35.358672] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:00.357 [2024-11-08 04:51:35.358813] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:00.357 [2024-11-08 04:51:35.416809] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:00.357 [2024-11-08 04:51:35.433116] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4414 *** 00:08:00.357 INFO: Running with entropic power schedule (0xFF, 100). 00:08:00.357 INFO: Seed: 1664002020 00:08:00.357 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:00.357 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:00.357 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_14 00:08:00.357 INFO: A corpus is not provided, starting from an empty corpus 00:08:00.357 #2 INITED exec/s: 0 rss: 60Mb 00:08:00.357 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:00.357 This may also happen if the target rejected all inputs we tried so far 00:08:00.616 [2024-11-08 04:51:35.488901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.616 [2024-11-08 04:51:35.488932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.616 [2024-11-08 04:51:35.488991] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.616 [2024-11-08 04:51:35.489004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.616 [2024-11-08 04:51:35.489061] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.616 [2024-11-08 04:51:35.489074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.876 NEW_FUNC[1/673]: 0x44e908 in fuzz_admin_set_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:392 00:08:00.876 NEW_FUNC[2/673]: 0x46fd38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:00.876 #19 NEW cov: 11605 ft: 11606 corp: 2/29b lim: 35 exec/s: 0 rss: 68Mb L: 28/28 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:00.876 [2024-11-08 04:51:35.789345] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.789377] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.876 #20 NEW cov: 11718 ft: 12444 corp: 3/46b lim: 35 exec/s: 0 rss: 69Mb L: 17/28 MS: 1 CrossOver- 00:08:00.876 [2024-11-08 04:51:35.839721] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.839747] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.876 [2024-11-08 04:51:35.839809] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.839823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.876 [2024-11-08 04:51:35.839879] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.839893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.876 #21 NEW cov: 11724 ft: 12811 corp: 4/80b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 CrossOver- 00:08:00.876 [2024-11-08 04:51:35.879811] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.879836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.876 
[2024-11-08 04:51:35.879895] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.879909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.876 [2024-11-08 04:51:35.879970] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.879983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:00.876 #32 NEW cov: 11809 ft: 13092 corp: 5/108b lim: 35 exec/s: 0 rss: 69Mb L: 28/34 MS: 1 ChangeBinInt- 00:08:00.876 [2024-11-08 04:51:35.919760] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.919785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:00.876 [2024-11-08 04:51:35.919843] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.919857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:00.876 [2024-11-08 04:51:35.919915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.876 [2024-11-08 04:51:35.919928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:00.876 #33 NEW cov: 11809 ft: 13283 corp: 6/130b lim: 35 exec/s: 0 rss: 69Mb L: 22/34 MS: 1 InsertRepeatedBytes- 00:08:00.877 [2024-11-08 04:51:35.959767] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:00.877 [2024-11-08 04:51:35.959792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.136 #34 NEW cov: 11809 ft: 13377 corp: 7/147b lim: 35 exec/s: 0 rss: 69Mb L: 17/34 MS: 1 CrossOver- 00:08:01.136 [2024-11-08 04:51:36.000180] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.000204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.136 [2024-11-08 04:51:36.000262] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.000276] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.136 [2024-11-08 04:51:36.000335] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.000348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.136 #35 NEW cov: 11809 ft: 13439 corp: 8/175b lim: 35 exec/s: 0 rss: 69Mb L: 28/34 MS: 1 ChangeBinInt- 
00:08:01.136 [2024-11-08 04:51:36.040322] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.040347] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.136 [2024-11-08 04:51:36.040410] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.040423] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.136 [2024-11-08 04:51:36.040496] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000021 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.040510] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.136 #36 NEW cov: 11809 ft: 13464 corp: 9/203b lim: 35 exec/s: 0 rss: 69Mb L: 28/34 MS: 1 InsertRepeatedBytes- 00:08:01.136 [2024-11-08 04:51:36.080416] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.080440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.136 [2024-11-08 04:51:36.080499] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.080513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.136 [2024-11-08 04:51:36.080573] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.080586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.136 #37 NEW cov: 11809 ft: 13499 corp: 10/231b lim: 35 exec/s: 0 rss: 69Mb L: 28/34 MS: 1 ChangeBinInt- 00:08:01.136 [2024-11-08 04:51:36.120513] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.120544] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.136 [2024-11-08 04:51:36.120605] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.120619] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.136 [2024-11-08 04:51:36.120680] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.136 [2024-11-08 04:51:36.120693] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.136 #38 NEW cov: 11809 ft: 13538 corp: 11/259b lim: 35 exec/s: 0 rss: 69Mb L: 28/34 MS: 1 ChangeByte- 00:08:01.137 [2024-11-08 04:51:36.160630] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET 
FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.160656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.137 [2024-11-08 04:51:36.160716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.160730] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.137 [2024-11-08 04:51:36.160790] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.160803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.137 #39 NEW cov: 11809 ft: 13559 corp: 12/287b lim: 35 exec/s: 0 rss: 69Mb L: 28/34 MS: 1 ShuffleBytes- 00:08:01.137 [2024-11-08 04:51:36.200755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.200784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.137 [2024-11-08 04:51:36.200843] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.200857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.137 [2024-11-08 04:51:36.200915] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.200929] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.137 #45 NEW cov: 11809 ft: 13683 corp: 13/321b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeBit- 00:08:01.137 [2024-11-08 04:51:36.240908] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.240933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.137 [2024-11-08 04:51:36.240993] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.241007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.137 [2024-11-08 04:51:36.241066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.137 [2024-11-08 04:51:36.241080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.396 #46 NEW cov: 11809 ft: 13706 corp: 14/350b lim: 35 exec/s: 0 rss: 69Mb L: 29/34 MS: 1 InsertByte- 00:08:01.396 [2024-11-08 04:51:36.281006] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 
04:51:36.281031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.281092] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.281105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.281161] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.281175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.396 #47 NEW cov: 11809 ft: 13756 corp: 15/383b lim: 35 exec/s: 0 rss: 69Mb L: 33/34 MS: 1 CopyPart- 00:08:01.396 [2024-11-08 04:51:36.321145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.321170] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.321228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.321242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.321314] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.321328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.396 #48 NEW cov: 11809 ft: 13795 corp: 16/417b lim: 35 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeByte- 00:08:01.396 [2024-11-08 04:51:36.361222] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:80000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.361252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.361312] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.361326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.361385] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.361398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.396 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:01.396 #49 NEW cov: 11839 ft: 13908 corp: 17/451b lim: 35 exec/s: 0 rss: 70Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:01.396 [2024-11-08 04:51:36.401362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
00:08:01.396 [2024-11-08 04:51:36.401386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.401441] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.401455] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.401512] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.401528] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.396 #50 NEW cov: 11839 ft: 13910 corp: 18/480b lim: 35 exec/s: 0 rss: 70Mb L: 29/34 MS: 1 InsertByte- 00:08:01.396 [2024-11-08 04:51:36.441453] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.441478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.441536] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.396 [2024-11-08 04:51:36.441550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.396 [2024-11-08 04:51:36.441608] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.397 [2024-11-08 04:51:36.441621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.397 #51 NEW cov: 11839 ft: 13920 corp: 19/508b lim: 35 exec/s: 51 rss: 70Mb L: 28/34 MS: 1 ChangeBinInt- 00:08:01.397 [2024-11-08 04:51:36.481205] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.397 [2024-11-08 04:51:36.481230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.397 [2024-11-08 04:51:36.481287] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.397 [2024-11-08 04:51:36.481301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.656 #52 NEW cov: 11839 ft: 14097 corp: 20/527b lim: 35 exec/s: 52 rss: 70Mb L: 19/34 MS: 1 EraseBytes- 00:08:01.656 [2024-11-08 04:51:36.521728] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.521760] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.656 [2024-11-08 04:51:36.521820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.521834] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD 
(00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.656 [2024-11-08 04:51:36.521877] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.521891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.656 #53 NEW cov: 11839 ft: 14118 corp: 21/555b lim: 35 exec/s: 53 rss: 70Mb L: 28/34 MS: 1 ShuffleBytes- 00:08:01.656 [2024-11-08 04:51:36.561736] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000038 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.561761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.656 [2024-11-08 04:51:36.561821] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.561835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.656 [2024-11-08 04:51:36.561893] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.561907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.656 [2024-11-08 04:51:36.561963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000db SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.561975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.656 #54 NEW cov: 11839 ft: 14328 corp: 22/584b lim: 35 exec/s: 54 rss: 70Mb L: 29/34 MS: 1 InsertByte- 00:08:01.656 [2024-11-08 04:51:36.601981] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.602006] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.656 [2024-11-08 04:51:36.602067] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.602080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.656 NEW_FUNC[1/2]: 0x4691c8 in feat_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:273 00:08:01.656 NEW_FUNC[2/2]: 0x112c368 in nvmf_ctrlr_set_features_arbitration /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1489 00:08:01.656 #55 NEW cov: 11896 ft: 14398 corp: 23/612b lim: 35 exec/s: 55 rss: 70Mb L: 28/34 MS: 1 ChangeBinInt- 00:08:01.656 [2024-11-08 04:51:36.641489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:8000000c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.641516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.656 #60 NEW cov: 11896 ft: 14971 corp: 24/619b 
lim: 35 exec/s: 60 rss: 70Mb L: 7/34 MS: 5 ChangeByte-ChangeBit-CopyPart-InsertByte-CMP- DE: "\376\377\377\377"- 00:08:01.656 [2024-11-08 04:51:36.681584] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:8000000c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.681614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.656 #61 NEW cov: 11896 ft: 14979 corp: 25/626b lim: 35 exec/s: 61 rss: 70Mb L: 7/34 MS: 1 CopyPart- 00:08:01.656 [2024-11-08 04:51:36.722295] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:5 cdw10:00000010 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.722320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.656 [2024-11-08 04:51:36.722379] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:6 cdw10:00000010 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.722393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.656 [2024-11-08 04:51:36.722450] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES HOST CONTROLLED THERMAL MANAGEMENT cid:7 cdw10:00000010 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.722464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.656 #66 NEW cov: 11896 ft: 14998 corp: 26/656b lim: 35 exec/s: 66 rss: 70Mb L: 30/34 MS: 5 CopyPart-PersAutoDict-ChangeByte-ChangeByte-InsertRepeatedBytes- DE: "\376\377\377\377"- 00:08:01.656 [2024-11-08 04:51:36.762392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.656 [2024-11-08 04:51:36.762417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.657 [2024-11-08 04:51:36.762478] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.657 [2024-11-08 04:51:36.762492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.657 [2024-11-08 04:51:36.762556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.657 [2024-11-08 04:51:36.762571] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.916 #67 NEW cov: 11896 ft: 15084 corp: 27/690b lim: 35 exec/s: 67 rss: 70Mb L: 34/34 MS: 1 ChangeBinInt- 00:08:01.916 [2024-11-08 04:51:36.802459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.802484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.802548] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA 
BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.802562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.802623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000021 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.802636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.916 #68 NEW cov: 11896 ft: 15088 corp: 28/718b lim: 35 exec/s: 68 rss: 70Mb L: 28/34 MS: 1 CrossOver- 00:08:01.916 [2024-11-08 04:51:36.842660] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.842685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.842744] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.842761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.842818] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.842831] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.916 #69 NEW cov: 11896 ft: 15096 corp: 29/748b lim: 35 exec/s: 69 rss: 70Mb L: 30/34 MS: 1 InsertByte- 00:08:01.916 [2024-11-08 04:51:36.882739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.882765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.882826] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.882839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.882916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.882931] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.916 #70 NEW cov: 11896 ft: 15104 corp: 30/781b lim: 35 exec/s: 70 rss: 70Mb L: 33/34 MS: 1 CrossOver- 00:08:01.916 [2024-11-08 04:51:36.922654] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.922679] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.922739] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.922752] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.922810] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:0000004c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.922823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.916 #71 NEW cov: 11896 ft: 15119 corp: 31/803b lim: 35 exec/s: 71 rss: 70Mb L: 22/34 MS: 1 ChangeBinInt- 00:08:01.916 [2024-11-08 04:51:36.962946] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.962972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.963030] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.963044] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:36.963101] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:36.963114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:01.916 #72 NEW cov: 11896 ft: 15136 corp: 32/831b lim: 35 exec/s: 72 rss: 70Mb L: 28/34 MS: 1 ChangeByte- 00:08:01.916 [2024-11-08 04:51:37.002709] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:37.002734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:01.916 [2024-11-08 04:51:37.002794] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:01.916 [2024-11-08 04:51:37.002811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:01.916 #73 NEW cov: 11896 ft: 15143 corp: 33/847b lim: 35 exec/s: 73 rss: 70Mb L: 16/34 MS: 1 EraseBytes- 00:08:02.176 [2024-11-08 04:51:37.042650] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:0000000c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.042675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.176 #74 NEW cov: 11896 ft: 15235 corp: 34/855b lim: 35 exec/s: 74 rss: 70Mb L: 8/34 MS: 1 InsertByte- 00:08:02.176 [2024-11-08 04:51:37.083285] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.083310] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.083370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.083383] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.083440] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:000000db SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.083453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.176 #75 NEW cov: 11896 ft: 15256 corp: 35/884b lim: 35 exec/s: 75 rss: 70Mb L: 29/34 MS: 1 InsertByte- 00:08:02.176 [2024-11-08 04:51:37.123556] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.123580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.123638] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.123652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.123711] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.123724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.123783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.123796] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.176 #76 NEW cov: 11896 ft: 15295 corp: 36/918b lim: 35 exec/s: 76 rss: 70Mb L: 34/34 MS: 1 ShuffleBytes- 00:08:02.176 [2024-11-08 04:51:37.163432] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.163459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.163521] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.163540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.176 #77 NEW cov: 11896 ft: 15341 corp: 37/945b lim: 35 exec/s: 77 rss: 70Mb L: 27/34 MS: 1 InsertRepeatedBytes- 00:08:02.176 [2024-11-08 04:51:37.203528] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.203558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.203618] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.203633] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE 
(01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.176 #78 NEW cov: 11896 ft: 15347 corp: 38/972b lim: 35 exec/s: 78 rss: 70Mb L: 27/34 MS: 1 PersAutoDict- DE: "\376\377\377\377"- 00:08:02.176 [2024-11-08 04:51:37.243355] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES AUTONOMOUS POWER STATE TRANSITION cid:4 cdw10:8000000c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.243382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.176 #79 NEW cov: 11896 ft: 15350 corp: 39/979b lim: 35 exec/s: 79 rss: 70Mb L: 7/34 MS: 1 ChangeBit- 00:08:02.176 [2024-11-08 04:51:37.284013] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:0000003c SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.284038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.284099] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.284113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.176 [2024-11-08 04:51:37.284180] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.176 [2024-11-08 04:51:37.284195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.435 #80 NEW cov: 11896 ft: 15359 corp: 40/1007b lim: 35 exec/s: 80 rss: 70Mb L: 28/34 MS: 1 CMP- DE: "\262\371\017<\037\177\000\000"- 00:08:02.435 [2024-11-08 04:51:37.323899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.323925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.435 [2024-11-08 04:51:37.323985] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.324001] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.435 #81 NEW cov: 11896 ft: 15368 corp: 41/1034b lim: 35 exec/s: 81 rss: 70Mb L: 27/34 MS: 1 ChangeBinInt- 00:08:02.435 [2024-11-08 04:51:37.364057] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:4 cdw10:800000e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.364084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:02.435 [2024-11-08 04:51:37.364145] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:800000e7 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.364161] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: FEATURE ID NOT SAVEABLE (01/0d) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.435 #82 NEW cov: 11896 ft: 15374 corp: 42/1061b lim: 35 exec/s: 82 rss: 70Mb L: 27/34 MS: 1 ChangeBit- 00:08:02.435 [2024-11-08 04:51:37.404387] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.404411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.435 [2024-11-08 04:51:37.404472] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.404485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.435 #83 NEW cov: 11896 ft: 15380 corp: 43/1094b lim: 35 exec/s: 83 rss: 70Mb L: 33/34 MS: 1 CopyPart- 00:08:02.435 [2024-11-08 04:51:37.444459] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:5 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.444483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.435 [2024-11-08 04:51:37.444545] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.444559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.435 [2024-11-08 04:51:37.444635] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:7 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.444648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.435 #84 NEW cov: 11896 ft: 15387 corp: 44/1122b lim: 35 exec/s: 84 rss: 70Mb L: 28/34 MS: 1 ChangeBit- 00:08:02.435 [2024-11-08 04:51:37.484444] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: SET FEATURES RESERVED cid:6 cdw10:00000000 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:02.435 [2024-11-08 04:51:37.484469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.436 NEW_FUNC[1/2]: 0x473ca8 in feat_keep_alive_timer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:364 00:08:02.436 NEW_FUNC[2/2]: 0x1128c98 in nvmf_ctrlr_set_features_keep_alive_timer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/nvmf/ctrlr.c:1883 00:08:02.436 #85 NEW cov: 11967 ft: 15567 corp: 45/1147b lim: 35 exec/s: 42 rss: 70Mb L: 25/34 MS: 1 PersAutoDict- DE: "\262\371\017<\037\177\000\000"- 00:08:02.436 #85 DONE cov: 11967 ft: 15567 corp: 45/1147b lim: 35 exec/s: 42 rss: 70Mb 00:08:02.436 ###### Recommended dictionary. ###### 00:08:02.436 "\376\377\377\377" # Uses: 2 00:08:02.436 "\262\371\017<\037\177\000\000" # Uses: 1 00:08:02.436 ###### End of recommended dictionary. 
###### 00:08:02.436 Done 85 runs in 2 second(s) 00:08:02.695 04:51:37 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_14.conf 00:08:02.695 04:51:37 -- ../common.sh@72 -- # (( i++ )) 00:08:02.695 04:51:37 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:02.695 04:51:37 -- ../common.sh@73 -- # start_llvm_fuzz 15 1 0x1 00:08:02.695 04:51:37 -- nvmf/run.sh@23 -- # local fuzzer_type=15 00:08:02.695 04:51:37 -- nvmf/run.sh@24 -- # local timen=1 00:08:02.695 04:51:37 -- nvmf/run.sh@25 -- # local core=0x1 00:08:02.695 04:51:37 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:02.695 04:51:37 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_15.conf 00:08:02.695 04:51:37 -- nvmf/run.sh@29 -- # printf %02d 15 00:08:02.695 04:51:37 -- nvmf/run.sh@29 -- # port=4415 00:08:02.695 04:51:37 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:02.695 04:51:37 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' 00:08:02.695 04:51:37 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4415"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:02.695 04:51:37 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4415' -c /tmp/fuzz_json_15.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 -Z 15 -r /var/tmp/spdk15.sock 00:08:02.695 [2024-11-08 04:51:37.670426] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:02.695 [2024-11-08 04:51:37.670498] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3685450 ] 00:08:02.695 EAL: No free 2048 kB hugepages reported on node 1 00:08:02.954 [2024-11-08 04:51:37.851497] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:02.954 [2024-11-08 04:51:37.915829] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:02.954 [2024-11-08 04:51:37.915966] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:02.954 [2024-11-08 04:51:37.974540] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:02.954 [2024-11-08 04:51:37.990847] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4415 *** 00:08:02.954 INFO: Running with entropic power schedule (0xFF, 100). 00:08:02.954 INFO: Seed: 4220009111 00:08:02.954 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:02.954 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:02.954 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_15 00:08:02.954 INFO: A corpus is not provided, starting from an empty corpus 00:08:02.954 #2 INITED exec/s: 0 rss: 61Mb 00:08:02.954 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:02.954 This may also happen if the target rejected all inputs we tried so far 00:08:02.954 [2024-11-08 04:51:38.062153] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.954 [2024-11-08 04:51:38.062193] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:02.954 [2024-11-08 04:51:38.062340] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.954 [2024-11-08 04:51:38.062362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:02.954 [2024-11-08 04:51:38.062510] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.954 [2024-11-08 04:51:38.062539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:02.954 [2024-11-08 04:51:38.062690] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:02.954 [2024-11-08 04:51:38.062711] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.472 NEW_FUNC[1/669]: 0x44fe48 in fuzz_admin_get_features_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:460 00:08:03.472 NEW_FUNC[2/669]: 0x46fd38 in feat_write_atomicity /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:340 00:08:03.472 #8 NEW cov: 11531 ft: 11532 corp: 2/36b lim: 35 exec/s: 0 rss: 68Mb L: 35/35 MS: 1 InsertRepeatedBytes- 00:08:03.472 [2024-11-08 04:51:38.381958] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.381995] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.382118] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.382135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.472 NEW_FUNC[1/2]: 0xf641d8 in posix_sock_read /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/sock/posix/posix.c:1497 00:08:03.472 NEW_FUNC[2/2]: 0x1df5838 in spdk_pipe_writer_get_buffer /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/util/pipe.c:42 00:08:03.472 #9 NEW cov: 11687 ft: 12760 corp: 3/63b lim: 35 exec/s: 0 rss: 68Mb L: 27/35 MS: 1 CrossOver- 00:08:03.472 [2024-11-08 04:51:38.431489] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000015 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.431518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.472 #13 NEW cov: 11693 ft: 13359 corp: 4/71b lim: 35 exec/s: 0 rss: 68Mb L: 8/35 MS: 4 ShuffleBytes-ShuffleBytes-ChangeByte-InsertRepeatedBytes- 00:08:03.472 [2024-11-08 04:51:38.472321] 
nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.472351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.472484] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.472503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.472626] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.472644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.472 #14 NEW cov: 11778 ft: 13662 corp: 5/99b lim: 35 exec/s: 0 rss: 68Mb L: 28/35 MS: 1 InsertByte- 00:08:03.472 [2024-11-08 04:51:38.522535] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000015 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.522562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.522696] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.522713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.522841] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.522858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.522984] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.523000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.472 #15 NEW cov: 11778 ft: 13776 corp: 6/133b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 InsertRepeatedBytes- 00:08:03.472 [2024-11-08 04:51:38.572607] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.572636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.572777] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.572795] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.572935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.572952] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.472 [2024-11-08 04:51:38.573076] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.472 [2024-11-08 04:51:38.573097] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.731 #16 NEW cov: 11778 ft: 13851 corp: 7/167b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 ShuffleBytes- 00:08:03.731 [2024-11-08 04:51:38.622916] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.731 [2024-11-08 04:51:38.622942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.731 [2024-11-08 04:51:38.623064] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.731 [2024-11-08 04:51:38.623080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.731 [2024-11-08 04:51:38.623210] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.731 [2024-11-08 04:51:38.623226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.731 #17 NEW cov: 11778 ft: 13959 corp: 8/198b lim: 35 exec/s: 0 rss: 69Mb L: 31/35 MS: 1 CrossOver- 00:08:03.731 [2024-11-08 04:51:38.663080] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.731 [2024-11-08 04:51:38.663105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.731 [2024-11-08 04:51:38.663239] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.731 [2024-11-08 04:51:38.663256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.731 [2024-11-08 04:51:38.663413] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.731 [2024-11-08 04:51:38.663432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.731 [2024-11-08 04:51:38.663566] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.663586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.732 [2024-11-08 04:51:38.663717] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.663735] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.732 #18 NEW cov: 11778 ft: 14101 corp: 9/233b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 
CopyPart- 00:08:03.732 [2024-11-08 04:51:38.703190] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.703217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.732 [2024-11-08 04:51:38.703338] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.703355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.732 [2024-11-08 04:51:38.703483] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:0000077a SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.703501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.732 #19 NEW cov: 11778 ft: 14180 corp: 10/261b lim: 35 exec/s: 0 rss: 69Mb L: 28/35 MS: 1 ChangeByte- 00:08:03.732 [2024-11-08 04:51:38.743251] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.743279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.732 [2024-11-08 04:51:38.743402] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.743418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.732 [2024-11-08 04:51:38.743542] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.743560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.732 #20 NEW cov: 11778 ft: 14281 corp: 11/292b lim: 35 exec/s: 0 rss: 69Mb L: 31/35 MS: 1 ChangeBinInt- 00:08:03.732 [2024-11-08 04:51:38.783202] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000015 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.783229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.732 [2024-11-08 04:51:38.783349] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.783365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.732 [2024-11-08 04:51:38.783490] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.783509] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.732 [2024-11-08 04:51:38.783642] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 
[2024-11-08 04:51:38.783661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.732 #21 NEW cov: 11778 ft: 14332 corp: 12/326b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 CopyPart- 00:08:03.732 [2024-11-08 04:51:38.823068] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.732 [2024-11-08 04:51:38.823094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.991 #23 NEW cov: 11778 ft: 14488 corp: 13/340b lim: 35 exec/s: 0 rss: 69Mb L: 14/35 MS: 2 InsertRepeatedBytes-CMP- DE: "\377\377\377\377\376\377\377\377"- 00:08:03.991 [2024-11-08 04:51:38.863432] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000015 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.863459] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.863595] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.863613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.863737] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.863755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.863882] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.863902] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.991 #24 NEW cov: 11778 ft: 14510 corp: 14/374b lim: 35 exec/s: 0 rss: 69Mb L: 34/35 MS: 1 ShuffleBytes- 00:08:03.991 [2024-11-08 04:51:38.904075] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.904101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.904228] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.904246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.904369] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.904388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.904515] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.904537] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.991 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:03.991 #25 NEW cov: 11801 ft: 14549 corp: 15/409b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 CrossOver- 00:08:03.991 [2024-11-08 04:51:38.944208] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.944235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.944370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.944386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.944511] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.944551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.944701] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.944721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:03.991 #26 NEW cov: 11801 ft: 14592 corp: 16/444b lim: 35 exec/s: 0 rss: 69Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:03.991 [2024-11-08 04:51:38.983963] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.983990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.984131] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.984149] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.984284] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.984301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:03.991 [2024-11-08 04:51:38.984434] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:38.984452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:03.991 #27 NEW cov: 11801 ft: 14625 corp: 17/475b lim: 35 exec/s: 0 rss: 69Mb L: 31/35 MS: 1 ShuffleBytes- 00:08:03.991 [2024-11-08 04:51:39.023422] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000015 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 
[2024-11-08 04:51:39.023449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.991 #30 NEW cov: 11801 ft: 14649 corp: 18/482b lim: 35 exec/s: 30 rss: 69Mb L: 7/35 MS: 3 CrossOver-InsertByte-InsertByte- 00:08:03.991 [2024-11-08 04:51:39.063541] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES TIMESTAMP cid:4 cdw10:0000000e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:03.991 [2024-11-08 04:51:39.063569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:03.991 #31 NEW cov: 11801 ft: 14705 corp: 19/489b lim: 35 exec/s: 31 rss: 69Mb L: 7/35 MS: 1 ChangeBinInt- 00:08:04.251 [2024-11-08 04:51:39.114188] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000015 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.114217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.114357] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.114376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.114501] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.114518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.114652] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.114670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.114797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.114814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.251 #32 NEW cov: 11801 ft: 14780 corp: 20/524b lim: 35 exec/s: 32 rss: 69Mb L: 35/35 MS: 1 CopyPart- 00:08:04.251 [2024-11-08 04:51:39.164755] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.164785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.164914] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.164932] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.165066] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.165084] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.165225] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.165246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.165386] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.165403] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.251 #33 NEW cov: 11801 ft: 14797 corp: 21/559b lim: 35 exec/s: 33 rss: 70Mb L: 35/35 MS: 1 CrossOver- 00:08:04.251 [2024-11-08 04:51:39.214383] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000727 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.214411] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.214544] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.214562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.214691] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.214708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.251 #35 NEW cov: 11801 ft: 14818 corp: 22/581b lim: 35 exec/s: 35 rss: 70Mb L: 22/35 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:04.251 [2024-11-08 04:51:39.254899] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.254927] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.255056] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.255072] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.255200] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007ff SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.255217] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.251 #36 NEW cov: 11801 ft: 14855 corp: 23/612b lim: 35 exec/s: 36 rss: 70Mb L: 31/35 MS: 1 PersAutoDict- DE: "\377\377\377\377\376\377\377\377"- 00:08:04.251 [2024-11-08 04:51:39.305093] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.305120] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) 
qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.305259] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000001f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.305278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.305406] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.305424] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.305563] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.305581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.305716] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.305734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.251 #37 NEW cov: 11801 ft: 14870 corp: 24/647b lim: 35 exec/s: 37 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:04.251 [2024-11-08 04:51:39.355370] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.355399] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.355538] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.355556] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.355688] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.355704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.251 [2024-11-08 04:51:39.355832] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007f0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.251 [2024-11-08 04:51:39.355850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.511 #38 NEW cov: 11801 ft: 14882 corp: 25/682b lim: 35 exec/s: 38 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:08:04.511 [2024-11-08 04:51:39.405464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.405493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.405623] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA 
BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.405645] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.405779] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.405797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.405935] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.405954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.511 #39 NEW cov: 11801 ft: 14898 corp: 26/717b lim: 35 exec/s: 39 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:08:04.511 [2024-11-08 04:51:39.445634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.445661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.445797] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.445815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.445938] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.445955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.446084] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.446102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.511 #40 NEW cov: 11801 ft: 14909 corp: 27/752b lim: 35 exec/s: 40 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:04.511 [2024-11-08 04:51:39.485719] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.485745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.485877] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:0000001f SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.485894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.486034] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.486051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.511 
[2024-11-08 04:51:39.486186] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.486203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.486328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.486349] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.511 #41 NEW cov: 11801 ft: 14950 corp: 28/787b lim: 35 exec/s: 41 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:04.511 [2024-11-08 04:51:39.535890] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.535919] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.536054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.536071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.536209] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.536228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.536362] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:00000100 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.536380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.536516] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.536538] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.511 #42 NEW cov: 11801 ft: 14961 corp: 29/822b lim: 35 exec/s: 42 rss: 70Mb L: 35/35 MS: 1 ChangeByte- 00:08:04.511 [2024-11-08 04:51:39.586117] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.586145] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.586282] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.586300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.586424] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 
04:51:39.586441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.511 [2024-11-08 04:51:39.586579] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.511 [2024-11-08 04:51:39.586597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.511 #43 NEW cov: 11801 ft: 14967 corp: 30/857b lim: 35 exec/s: 43 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:04.769 [2024-11-08 04:51:39.636247] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.769 [2024-11-08 04:51:39.636278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.769 [2024-11-08 04:51:39.636418] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.769 [2024-11-08 04:51:39.636438] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.769 [2024-11-08 04:51:39.636560] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.769 [2024-11-08 04:51:39.636580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.769 [2024-11-08 04:51:39.636709] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007f0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.769 [2024-11-08 04:51:39.636725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.769 #44 NEW cov: 11801 ft: 14974 corp: 31/892b lim: 35 exec/s: 44 rss: 70Mb L: 35/35 MS: 1 ShuffleBytes- 00:08:04.769 [2024-11-08 04:51:39.685727] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.769 [2024-11-08 04:51:39.685755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.769 #45 NEW cov: 11801 ft: 14980 corp: 32/906b lim: 35 exec/s: 45 rss: 70Mb L: 14/35 MS: 1 ChangeBinInt- 00:08:04.769 [2024-11-08 04:51:39.735598] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES TIMESTAMP cid:4 cdw10:0000000e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.735624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:04.770 #46 NEW cov: 11801 ft: 15005 corp: 33/913b lim: 35 exec/s: 46 rss: 70Mb L: 7/35 MS: 1 CopyPart- 00:08:04.770 [2024-11-08 04:51:39.776634] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.776661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.770 [2024-11-08 04:51:39.776784] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 
00:08:04.770 [2024-11-08 04:51:39.776801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.770 [2024-11-08 04:51:39.776923] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.776943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.770 [2024-11-08 04:51:39.777063] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.777080] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.770 NEW_FUNC[1/1]: 0x46e6f8 in feat_interrupt_coalescing /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:325 00:08:04.770 #47 NEW cov: 11823 ft: 15030 corp: 34/948b lim: 35 exec/s: 47 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:08:04.770 [2024-11-08 04:51:39.816663] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.816690] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.770 [2024-11-08 04:51:39.816820] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.816841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.770 [2024-11-08 04:51:39.816975] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.816993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.770 [2024-11-08 04:51:39.817113] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.817132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:04.770 #48 NEW cov: 11823 ft: 15034 corp: 35/983b lim: 35 exec/s: 48 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:08:04.770 [2024-11-08 04:51:39.856630] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.856657] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:04.770 [2024-11-08 04:51:39.856783] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.856803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:04.770 [2024-11-08 04:51:39.856925] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:04.770 [2024-11-08 04:51:39.856942] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:04.770 #49 NEW cov: 11823 ft: 15056 corp: 36/1014b lim: 35 exec/s: 49 rss: 70Mb L: 31/35 MS: 1 EraseBytes- 00:08:05.028 [2024-11-08 04:51:39.896901] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.896928] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.897054] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:00000000 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.897071] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.897182] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:00000700 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.897203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.897328] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.897345] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.897464] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.897482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.028 #50 NEW cov: 11823 ft: 15066 corp: 37/1049b lim: 35 exec/s: 50 rss: 70Mb L: 35/35 MS: 1 ChangeBinInt- 00:08:05.028 [2024-11-08 04:51:39.937098] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:4 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.937125] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.937249] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.937269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.937392] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.937408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.937541] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.937559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.937691] nvme_qpair.c: 
215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.937709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.028 #51 NEW cov: 11823 ft: 15072 corp: 38/1084b lim: 35 exec/s: 51 rss: 70Mb L: 35/35 MS: 1 CopyPart- 00:08:05.028 [2024-11-08 04:51:39.977277] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:5 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.977303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:5 cdw0:0 sqhd:0010 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.977436] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:6 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.977454] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:6 cdw0:0 sqhd:0011 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.977588] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:7 cdw10:000007e0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.977608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:7 cdw0:0 sqhd:0012 p:0 m:0 dnr:0 00:08:05.028 [2024-11-08 04:51:39.977734] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES RESERVED cid:8 cdw10:000007f0 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:39.977752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:8 cdw0:0 sqhd:0013 p:0 m:0 dnr:0 00:08:05.028 #52 NEW cov: 11823 ft: 15081 corp: 39/1119b lim: 35 exec/s: 52 rss: 70Mb L: 35/35 MS: 1 ChangeBit- 00:08:05.028 [2024-11-08 04:51:40.016355] nvme_qpair.c: 215:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES TIMESTAMP cid:4 cdw10:0000000e SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.028 [2024-11-08 04:51:40.016385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID FIELD (00/02) qid:0 cid:4 cdw0:0 sqhd:000f p:0 m:0 dnr:0 00:08:05.028 #53 NEW cov: 11823 ft: 15103 corp: 40/1131b lim: 35 exec/s: 26 rss: 70Mb L: 12/35 MS: 1 CopyPart- 00:08:05.028 #53 DONE cov: 11823 ft: 15103 corp: 40/1131b lim: 35 exec/s: 26 rss: 70Mb 00:08:05.028 ###### Recommended dictionary. ###### 00:08:05.029 "\377\377\377\377\376\377\377\377" # Uses: 1 00:08:05.029 ###### End of recommended dictionary. 
######
00:08:05.029 Done 53 runs in 2 second(s)
00:08:05.287 04:51:40 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_15.conf
00:08:05.287 04:51:40 -- ../common.sh@72 -- # (( i++ ))
00:08:05.287 04:51:40 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:05.287 04:51:40 -- ../common.sh@73 -- # start_llvm_fuzz 16 1 0x1
00:08:05.287 04:51:40 -- nvmf/run.sh@23 -- # local fuzzer_type=16
00:08:05.287 04:51:40 -- nvmf/run.sh@24 -- # local timen=1
00:08:05.287 04:51:40 -- nvmf/run.sh@25 -- # local core=0x1
00:08:05.287 04:51:40 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:05.287 04:51:40 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_16.conf
00:08:05.287 04:51:40 -- nvmf/run.sh@29 -- # printf %02d 16
00:08:05.287 04:51:40 -- nvmf/run.sh@29 -- # port=4416
00:08:05.287 04:51:40 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:05.287 04:51:40 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416'
00:08:05.287 04:51:40 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4416"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:05.287 04:51:40 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4416' -c /tmp/fuzz_json_16.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16 -Z 16 -r /var/tmp/spdk16.sock
00:08:05.287 [2024-11-08 04:51:40.206682] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:08:05.287 [2024-11-08 04:51:40.206784] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3685987 ]
00:08:05.287 EAL: No free 2048 kB hugepages reported on node 1
00:08:05.287 [2024-11-08 04:51:40.386162] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:05.546 [2024-11-08 04:51:40.452179] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:05.546 [2024-11-08 04:51:40.452306] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:05.546 [2024-11-08 04:51:40.511142] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:05.546 [2024-11-08 04:51:40.527426] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4416 ***
00:08:05.546 INFO: Running with entropic power schedule (0xFF, 100).
00:08:05.546 INFO: Seed: 2461034754
00:08:05.546 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:08:05.546 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:08:05.546 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_16
00:08:05.546 INFO: A corpus is not provided, starting from an empty corpus
00:08:05.546 #2 INITED exec/s: 0 rss: 61Mb
00:08:05.546 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:05.546 This may also happen if the target rejected all inputs we tried so far 00:08:05.546 [2024-11-08 04:51:40.575878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.546 [2024-11-08 04:51:40.575908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.546 [2024-11-08 04:51:40.575961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.546 [2024-11-08 04:51:40.575980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.805 NEW_FUNC[1/671]: 0x451308 in fuzz_nvm_read_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:519 00:08:05.805 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:05.805 #23 NEW cov: 11663 ft: 11661 corp: 2/58b lim: 105 exec/s: 0 rss: 68Mb L: 57/57 MS: 1 InsertRepeatedBytes- 00:08:05.805 [2024-11-08 04:51:40.896887] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.805 [2024-11-08 04:51:40.896920] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:05.805 [2024-11-08 04:51:40.896961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.805 [2024-11-08 04:51:40.896976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:05.805 [2024-11-08 04:51:40.897028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.805 [2024-11-08 04:51:40.897042] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:05.805 [2024-11-08 04:51:40.897092] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:05.805 [2024-11-08 04:51:40.897106] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.064 #24 NEW cov: 11776 ft: 12777 corp: 3/144b lim: 105 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 InsertRepeatedBytes- 00:08:06.064 [2024-11-08 04:51:40.946948] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:40.946976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:40.947011] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:40.947026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 
m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:40.947077] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3602879701896396800 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:40.947093] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:40.947144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:40.947158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.064 #30 NEW cov: 11782 ft: 12915 corp: 4/230b lim: 105 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 ChangeByte- 00:08:06.064 [2024-11-08 04:51:40.987049] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:40.987078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:40.987115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:40.987131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:40.987187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:40.987203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:40.987248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:40.987263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.064 #31 NEW cov: 11867 ft: 13229 corp: 5/316b lim: 105 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 ChangeBinInt- 00:08:06.064 [2024-11-08 04:51:41.026767] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.026794] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.064 #38 NEW cov: 11867 ft: 13852 corp: 6/352b lim: 105 exec/s: 0 rss: 68Mb L: 36/86 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:06.064 [2024-11-08 04:51:41.067242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.067269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:41.067312] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.067327] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:41.067379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:9223372036854775808 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.067393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:41.067443] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.067457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.064 #39 NEW cov: 11867 ft: 13952 corp: 7/438b lim: 105 exec/s: 0 rss: 68Mb L: 86/86 MS: 1 ChangeBit- 00:08:06.064 [2024-11-08 04:51:41.107012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.107039] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.064 #40 NEW cov: 11867 ft: 14030 corp: 8/474b lim: 105 exec/s: 0 rss: 68Mb L: 36/86 MS: 1 ShuffleBytes- 00:08:06.064 [2024-11-08 04:51:41.147494] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.147521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:41.147579] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.147592] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:41.147657] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3602879701896396800 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.147676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.064 [2024-11-08 04:51:41.147728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.064 [2024-11-08 04:51:41.147742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.324 #41 NEW cov: 11867 ft: 14080 corp: 9/560b lim: 105 exec/s: 0 rss: 69Mb L: 86/86 MS: 1 ChangeBinInt- 00:08:06.324 [2024-11-08 04:51:41.197560] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.197587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.197624] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968 
SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.197639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.197690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.197704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.324 #42 NEW cov: 11867 ft: 14372 corp: 10/625b lim: 105 exec/s: 0 rss: 69Mb L: 65/86 MS: 1 CopyPart- 00:08:06.324 [2024-11-08 04:51:41.237753] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.237780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.237843] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.237858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.237910] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:382252089344 len:22874 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.237925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.237976] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.237990] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.324 #48 NEW cov: 11867 ft: 14427 corp: 11/718b lim: 105 exec/s: 0 rss: 69Mb L: 93/93 MS: 1 InsertRepeatedBytes- 00:08:06.324 [2024-11-08 04:51:41.277788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.277816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.277852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.277867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.277918] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.277935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.324 #49 NEW cov: 11867 ft: 14462 corp: 12/797b lim: 105 exec/s: 0 rss: 69Mb L: 79/93 MS: 1 InsertRepeatedBytes- 00:08:06.324 [2024-11-08 
04:51:41.317648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.317675] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.324 #50 NEW cov: 11867 ft: 14503 corp: 13/833b lim: 105 exec/s: 0 rss: 69Mb L: 36/93 MS: 1 ChangeBit- 00:08:06.324 [2024-11-08 04:51:41.358136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.358163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.358222] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.358236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.358289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:17868022686844715255 len:63233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.358305] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.358367] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.358382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.324 #51 NEW cov: 11867 ft: 14522 corp: 14/919b lim: 105 exec/s: 0 rss: 69Mb L: 86/93 MS: 1 CopyPart- 00:08:06.324 [2024-11-08 04:51:41.397966] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.397993] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.324 [2024-11-08 04:51:41.398028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.324 [2024-11-08 04:51:41.398043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.324 #52 NEW cov: 11867 ft: 14557 corp: 15/976b lim: 105 exec/s: 0 rss: 69Mb L: 57/93 MS: 1 CopyPart- 00:08:06.583 [2024-11-08 04:51:41.438115] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.583 [2024-11-08 04:51:41.438143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.583 [2024-11-08 04:51:41.438197] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.583 [2024-11-08 04:51:41.438212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.583 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:06.583 #53 NEW cov: 11890 ft: 14592 corp: 16/1033b lim: 105 exec/s: 0 rss: 69Mb L: 57/93 MS: 1 ChangeByte- 00:08:06.583 [2024-11-08 04:51:41.478200] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.583 [2024-11-08 04:51:41.478227] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.583 [2024-11-08 04:51:41.478284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.583 [2024-11-08 04:51:41.478299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.583 #54 NEW cov: 11890 ft: 14610 corp: 17/1090b lim: 105 exec/s: 0 rss: 69Mb L: 57/93 MS: 1 CopyPart- 00:08:06.583 [2024-11-08 04:51:41.518561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.583 [2024-11-08 04:51:41.518587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.583 [2024-11-08 04:51:41.518636] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.583 [2024-11-08 04:51:41.518651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.583 [2024-11-08 04:51:41.518704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:382252089344 len:22874 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.583 [2024-11-08 04:51:41.518719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.583 [2024-11-08 04:51:41.518770] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.583 [2024-11-08 04:51:41.518785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.584 #55 NEW cov: 11890 ft: 14618 corp: 18/1183b lim: 105 exec/s: 0 rss: 69Mb L: 93/93 MS: 1 ChangeBinInt- 00:08:06.584 [2024-11-08 04:51:41.558491] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.558517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.558585] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.558601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.584 #56 NEW cov: 11890 ft: 14626 corp: 19/1240b lim: 105 exec/s: 56 rss: 69Mb L: 57/93 MS: 1 CMP- DE: "\002\000\000\000\000\000\000\000"- 00:08:06.584 
[2024-11-08 04:51:41.598693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.598721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.598758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.598773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.598825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.598840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.584 #57 NEW cov: 11890 ft: 14635 corp: 20/1319b lim: 105 exec/s: 57 rss: 69Mb L: 79/93 MS: 1 ShuffleBytes- 00:08:06.584 [2024-11-08 04:51:41.638916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.638947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.638984] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.638998] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.639050] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.639065] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.639116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.639131] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.584 #58 NEW cov: 11890 ft: 14645 corp: 21/1423b lim: 105 exec/s: 58 rss: 69Mb L: 104/104 MS: 1 CopyPart- 00:08:06.584 [2024-11-08 04:51:41.679014] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.679041] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.679086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.679101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 
sqhd:0003 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.679169] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:382252089344 len:22874 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.679184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.584 [2024-11-08 04:51:41.679236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.584 [2024-11-08 04:51:41.679252] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.843 #59 NEW cov: 11890 ft: 14665 corp: 22/1518b lim: 105 exec/s: 59 rss: 69Mb L: 95/104 MS: 1 CMP- DE: "\000\002"- 00:08:06.843 [2024-11-08 04:51:41.719199] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.843 [2024-11-08 04:51:41.719236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.843 [2024-11-08 04:51:41.719299] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.843 [2024-11-08 04:51:41.719315] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.843 [2024-11-08 04:51:41.719365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.843 [2024-11-08 04:51:41.719380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.719434] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.719449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.844 #60 NEW cov: 11890 ft: 14683 corp: 23/1604b lim: 105 exec/s: 60 rss: 69Mb L: 86/104 MS: 1 CopyPart- 00:08:06.844 [2024-11-08 04:51:41.759242] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.759269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.759313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.759328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.759379] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.759395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.844 
[2024-11-08 04:51:41.759446] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.759461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.844 #61 NEW cov: 11890 ft: 14701 corp: 24/1708b lim: 105 exec/s: 61 rss: 70Mb L: 104/104 MS: 1 ShuffleBytes- 00:08:06.844 [2024-11-08 04:51:41.799386] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.799414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.799461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.799476] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.799533] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:4160159744 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.799549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.799601] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.799616] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.844 #62 NEW cov: 11890 ft: 14744 corp: 25/1794b lim: 105 exec/s: 62 rss: 70Mb L: 86/104 MS: 1 CopyPart- 00:08:06.844 [2024-11-08 04:51:41.839532] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.839559] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.839606] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.839621] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.839673] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.839688] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.839743] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.839757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.844 #63 NEW cov: 11890 ft: 14797 
corp: 26/1880b lim: 105 exec/s: 63 rss: 70Mb L: 86/104 MS: 1 CopyPart- 00:08:06.844 [2024-11-08 04:51:41.879376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.879402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.879453] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:522125824 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.879469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.844 #64 NEW cov: 11890 ft: 14811 corp: 27/1924b lim: 105 exec/s: 64 rss: 70Mb L: 44/104 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:06.844 [2024-11-08 04:51:41.919726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.919753] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.919791] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.919806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.919860] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:15492382718154506240 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.919875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:06.844 [2024-11-08 04:51:41.919928] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:06.844 [2024-11-08 04:51:41.919942] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:06.844 #65 NEW cov: 11890 ft: 14834 corp: 28/2011b lim: 105 exec/s: 65 rss: 70Mb L: 87/104 MS: 1 InsertByte- 00:08:07.103 [2024-11-08 04:51:41.959741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.103 [2024-11-08 04:51:41.959768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.103 [2024-11-08 04:51:41.959806] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:2242545357980376863 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.103 [2024-11-08 04:51:41.959820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.103 [2024-11-08 04:51:41.959872] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.103 [2024-11-08 04:51:41.959887] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.103 #66 NEW cov: 11890 ft: 14837 corp: 29/2082b lim: 105 exec/s: 66 rss: 70Mb L: 71/104 MS: 1 InsertRepeatedBytes- 00:08:07.103 [2024-11-08 04:51:41.999935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.103 [2024-11-08 04:51:41.999973] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.103 [2024-11-08 04:51:42.000028] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.103 [2024-11-08 04:51:42.000043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.103 [2024-11-08 04:51:42.000094] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:22874 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.103 [2024-11-08 04:51:42.000109] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.103 [2024-11-08 04:51:42.000160] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.103 [2024-11-08 04:51:42.000174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.104 #67 NEW cov: 11890 ft: 14847 corp: 30/2176b lim: 105 exec/s: 67 rss: 70Mb L: 94/104 MS: 1 InsertByte- 00:08:07.104 [2024-11-08 04:51:42.039970] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.039997] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.040035] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.040050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.040102] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.040117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.104 #68 NEW cov: 11890 ft: 14865 corp: 31/2254b lim: 105 exec/s: 68 rss: 70Mb L: 78/104 MS: 1 InsertRepeatedBytes- 00:08:07.104 [2024-11-08 04:51:42.080187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.080213] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.080276] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.080291] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.080343] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:17868022686844715255 len:63233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.080358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.080410] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.080425] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.104 #69 NEW cov: 11890 ft: 14942 corp: 32/2340b lim: 105 exec/s: 69 rss: 70Mb L: 86/104 MS: 1 ChangeBit- 00:08:07.104 [2024-11-08 04:51:42.120129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.120156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.120212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.120226] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.104 #70 NEW cov: 11890 ft: 15015 corp: 33/2401b lim: 105 exec/s: 70 rss: 70Mb L: 61/104 MS: 1 CMP- DE: "\037\000\000\000"- 00:08:07.104 [2024-11-08 04:51:42.160520] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.160565] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.160625] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069936717599 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.160641] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.160693] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.160707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.160759] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744069951455231 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.160773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.160825] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT 
DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.160840] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:07.104 #71 NEW cov: 11890 ft: 15075 corp: 34/2506b lim: 105 exec/s: 71 rss: 70Mb L: 105/105 MS: 1 InsertRepeatedBytes- 00:08:07.104 [2024-11-08 04:51:42.200517] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.200547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.200597] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63233 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.200612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.200664] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:1507328 len:90 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.200680] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.104 [2024-11-08 04:51:42.200732] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.104 [2024-11-08 04:51:42.200745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.363 #72 NEW cov: 11890 ft: 15117 corp: 35/2603b lim: 105 exec/s: 72 rss: 70Mb L: 97/105 MS: 1 InsertRepeatedBytes- 00:08:07.363 [2024-11-08 04:51:42.240421] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.240451] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.363 [2024-11-08 04:51:42.240519] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.240539] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.363 #73 NEW cov: 11890 ft: 15129 corp: 36/2660b lim: 105 exec/s: 73 rss: 70Mb L: 57/105 MS: 1 PersAutoDict- DE: "\002\000\000\000\000\000\000\000"- 00:08:07.363 [2024-11-08 04:51:42.280568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.280596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.363 [2024-11-08 04:51:42.280646] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.280661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.363 #74 NEW cov: 11890 ft: 15138 corp: 37/2706b lim: 
105 exec/s: 74 rss: 70Mb L: 46/105 MS: 1 EraseBytes- 00:08:07.363 [2024-11-08 04:51:42.320549] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:8017 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.320577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.363 #75 NEW cov: 11890 ft: 15198 corp: 38/2742b lim: 105 exec/s: 75 rss: 70Mb L: 36/105 MS: 1 ChangeByte- 00:08:07.363 [2024-11-08 04:51:42.361135] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.361162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.363 [2024-11-08 04:51:42.361213] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:18446744069936717599 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.361228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.363 [2024-11-08 04:51:42.361280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:55521 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.361293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.363 [2024-11-08 04:51:42.361345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:18446744069951455231 len:65536 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.361360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.363 [2024-11-08 04:51:42.361413] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:4 nsid:0 lba:2242545357980376863 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.361428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:07.363 #76 NEW cov: 11890 ft: 15217 corp: 39/2847b lim: 105 exec/s: 76 rss: 70Mb L: 105/105 MS: 1 ChangeBinInt- 00:08:07.363 [2024-11-08 04:51:42.400796] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.363 [2024-11-08 04:51:42.400823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.363 #77 NEW cov: 11890 ft: 15231 corp: 40/2883b lim: 105 exec/s: 77 rss: 70Mb L: 36/105 MS: 1 ChangeByte- 00:08:07.364 [2024-11-08 04:51:42.440911] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:2242545358517247775 len:7968 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.364 [2024-11-08 04:51:42.440938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.364 #78 NEW cov: 11890 ft: 15260 corp: 41/2919b lim: 105 exec/s: 78 rss: 70Mb L: 36/105 MS: 1 CopyPart- 00:08:07.623 [2024-11-08 04:51:42.481032] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.623 [2024-11-08 04:51:42.481060] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.623 #79 NEW cov: 11890 ft: 15295 corp: 42/2955b lim: 105 exec/s: 79 rss: 70Mb L: 36/105 MS: 1 EraseBytes- 00:08:07.623 [2024-11-08 04:51:42.521504] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:17868022687012487168 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.623 [2024-11-08 04:51:42.521536] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.623 [2024-11-08 04:51:42.521576] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:17868022691004938231 len:63480 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.623 [2024-11-08 04:51:42.521591] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.623 [2024-11-08 04:51:42.521658] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:2 nsid:0 lba:3602879701896396800 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.623 [2024-11-08 04:51:42.521673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:07.623 [2024-11-08 04:51:42.521726] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:3 nsid:0 lba:0 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.623 [2024-11-08 04:51:42.521742] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:07.623 #80 NEW cov: 11890 ft: 15302 corp: 43/3041b lim: 105 exec/s: 80 rss: 70Mb L: 86/105 MS: 1 CrossOver- 00:08:07.623 [2024-11-08 04:51:42.561353] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:0 nsid:0 lba:167772160 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.623 [2024-11-08 04:51:42.561380] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:07.623 [2024-11-08 04:51:42.561417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: READ sqid:1 cid:1 nsid:0 lba:11708975778346565770 len:1 SGL TRANSPORT DATA BLOCK TRANSPORT 0x0 00:08:07.623 [2024-11-08 04:51:42.561432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:07.623 #81 NEW cov: 11890 ft: 15312 corp: 44/3098b lim: 105 exec/s: 40 rss: 70Mb L: 57/105 MS: 1 CMP- DE: "\212\242~\243o\226\203\000"- 00:08:07.623 #81 DONE cov: 11890 ft: 15312 corp: 44/3098b lim: 105 exec/s: 40 rss: 70Mb 00:08:07.623 ###### Recommended dictionary. ###### 00:08:07.623 "\002\000\000\000\000\000\000\000" # Uses: 2 00:08:07.623 "\000\002" # Uses: 0 00:08:07.623 "\037\000\000\000" # Uses: 0 00:08:07.623 "\212\242~\243o\226\203\000" # Uses: 0 00:08:07.623 ###### End of recommended dictionary. 
###### 00:08:07.623 Done 81 runs in 2 second(s) 00:08:07.623 04:51:42 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_16.conf 00:08:07.623 04:51:42 -- ../common.sh@72 -- # (( i++ )) 00:08:07.623 04:51:42 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:07.623 04:51:42 -- ../common.sh@73 -- # start_llvm_fuzz 17 1 0x1 00:08:07.623 04:51:42 -- nvmf/run.sh@23 -- # local fuzzer_type=17 00:08:07.623 04:51:42 -- nvmf/run.sh@24 -- # local timen=1 00:08:07.623 04:51:42 -- nvmf/run.sh@25 -- # local core=0x1 00:08:07.623 04:51:42 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:07.623 04:51:42 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_17.conf 00:08:07.623 04:51:42 -- nvmf/run.sh@29 -- # printf %02d 17 00:08:07.623 04:51:42 -- nvmf/run.sh@29 -- # port=4417 00:08:07.623 04:51:42 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:07.623 04:51:42 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' 00:08:07.623 04:51:42 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4417"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:07.623 04:51:42 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4417' -c /tmp/fuzz_json_17.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 -Z 17 -r /var/tmp/spdk17.sock 00:08:07.623 [2024-11-08 04:51:42.730631] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:07.623 [2024-11-08 04:51:42.730691] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3686519 ] 00:08:07.882 EAL: No free 2048 kB hugepages reported on node 1 00:08:07.882 [2024-11-08 04:51:42.904010] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:07.882 [2024-11-08 04:51:42.967351] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:07.882 [2024-11-08 04:51:42.967497] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:08.141 [2024-11-08 04:51:43.026118] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:08.141 [2024-11-08 04:51:43.042424] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4417 *** 00:08:08.141 INFO: Running with entropic power schedule (0xFF, 100). 00:08:08.141 INFO: Seed: 683080965 00:08:08.141 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:08.141 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:08.141 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_17 00:08:08.141 INFO: A corpus is not provided, starting from an empty corpus 00:08:08.141 #2 INITED exec/s: 0 rss: 60Mb 00:08:08.141 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
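For readers following the set -x trace above: the per-fuzzer launch in nvmf/run.sh boils down to the sketch below, reconstructed only from the traced commands. $rootdir standing for the spdk checkout is an assumption, the sed redirect into the per-instance config is inferred (set -x does not trace redirections), and all flags and paths are copied from the trace.

    # Minimal sketch of one fuzzer launch, reconstructed from the trace above.
    # Assumes $rootdir points at the spdk checkout.
    fuzzer_type=17                             # start_llvm_fuzz 17 1 0x1
    timen=1
    core=0x1
    port=44$(printf %02d "$fuzzer_type")       # printf %02d 17 -> "17", so port=4417
    corpus_dir="$rootdir/../corpus/llvm_nvmf_$fuzzer_type"
    mkdir -p "$corpus_dir"
    # Rewrite the TCP service id in the JSON config for this instance
    # (redirect into /tmp/fuzz_json_17.conf assumed, matching -c below):
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$rootdir/test/fuzz/llvm/nvmf/fuzz_json.conf" > "/tmp/fuzz_json_${fuzzer_type}.conf"
    # One fuzzer instance against the TCP target listening on 127.0.0.1:$port:
    "$rootdir/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$rootdir/../output/llvm/" \
        -F "trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port" \
        -c "/tmp/fuzz_json_${fuzzer_type}.conf" -t "$timen" -D "$corpus_dir" \
        -Z "$fuzzer_type" -r "/var/tmp/spdk${fuzzer_type}.sock"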
00:08:08.141 This may also happen if the target rejected all inputs we tried so far 00:08:08.141 [2024-11-08 04:51:43.098053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.141 [2024-11-08 04:51:43.098083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.141 [2024-11-08 04:51:43.098120] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.141 [2024-11-08 04:51:43.098136] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.141 [2024-11-08 04:51:43.098188] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.141 [2024-11-08 04:51:43.098204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.141 [2024-11-08 04:51:43.098258] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.141 [2024-11-08 04:51:43.098273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.400 NEW_FUNC[1/671]: 0x4545f8 in fuzz_nvm_write_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:540 00:08:08.400 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:08.400 #6 NEW cov: 11678 ft: 11680 corp: 2/99b lim: 120 exec/s: 0 rss: 68Mb L: 98/98 MS: 4 ChangeBit-CrossOver-EraseBytes-InsertRepeatedBytes- 00:08:08.400 [2024-11-08 04:51:43.398750] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.398785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.398830] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.398846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.398899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.398915] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.398967] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.398983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.400 NEW_FUNC[1/1]: 0x1c5f608 in spdk_thread_get_last_tsc /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1312 00:08:08.400 #12 NEW cov: 
11797 ft: 12232 corp: 3/197b lim: 120 exec/s: 0 rss: 68Mb L: 98/98 MS: 1 ShuffleBytes- 00:08:08.400 [2024-11-08 04:51:43.448828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.448858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.448899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.448916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.448969] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.448984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.449034] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.449050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.400 #14 NEW cov: 11803 ft: 12482 corp: 4/310b lim: 120 exec/s: 0 rss: 68Mb L: 113/113 MS: 2 InsertByte-InsertRepeatedBytes- 00:08:08.400 [2024-11-08 04:51:43.488885] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.488912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.488962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.488977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.489031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.400 [2024-11-08 04:51:43.489051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.400 [2024-11-08 04:51:43.489103] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.401 [2024-11-08 04:51:43.489119] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.660 #15 NEW cov: 11888 ft: 12841 corp: 5/423b lim: 120 exec/s: 0 rss: 68Mb L: 113/113 MS: 1 CopyPart- 00:08:08.660 [2024-11-08 04:51:43.529071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.529099] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.529147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.529164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.529217] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.529233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.529287] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.529303] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.660 #16 NEW cov: 11888 ft: 12926 corp: 6/521b lim: 120 exec/s: 0 rss: 68Mb L: 98/113 MS: 1 ChangeBit- 00:08:08.660 [2024-11-08 04:51:43.569170] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188512063488 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.569197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.569238] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.569254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.569307] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.569323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.569376] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.569390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.660 #17 NEW cov: 11888 ft: 12996 corp: 7/623b lim: 120 exec/s: 0 rss: 68Mb L: 102/113 MS: 1 CMP- DE: "\002\000\000\000"- 00:08:08.660 [2024-11-08 04:51:43.609252] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.609278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.609342] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.609359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 
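The DE: "\002\000\000\000" annotations on the CMP- and PersAutoDict mutations here, like the "###### Recommended dictionary ######" block that closed run 16 above, are libFuzzer dictionary entries printed in C octal escapes. A hedged sketch of carrying them into a later run follows: the file name is made up, the octal values are converted to the \xNN hex escapes that libFuzzer dictionary files accept, and whether SPDK's llvm_nvme_fuzz wrapper forwards the standard -dict= flag is an assumption.

    # Hypothetical: persist run 16's recommended entries for reuse.
    cat > /tmp/nvmf_16.dict <<'EOF'
    # "\002\000\000\000\000\000\000\000" (octal) in hex escapes:
    kw1="\x02\x00\x00\x00\x00\x00\x00\x00"
    # "\000\002"
    kw2="\x00\x02"
    # "\037\000\000\000"
    kw3="\x1f\x00\x00\x00"
    EOF
    # With a plain libFuzzer binary this would be passed as -dict=/tmp/nvmf_16.dict.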
00:08:08.660 [2024-11-08 04:51:43.609414] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.609430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.609482] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.609499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.660 #18 NEW cov: 11888 ft: 13166 corp: 8/741b lim: 120 exec/s: 0 rss: 68Mb L: 118/118 MS: 1 CrossOver- 00:08:08.660 [2024-11-08 04:51:43.649388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.649417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.649466] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.649483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.649538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.649555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.649608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.649625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.660 #19 NEW cov: 11888 ft: 13185 corp: 9/859b lim: 120 exec/s: 0 rss: 68Mb L: 118/118 MS: 1 PersAutoDict- DE: "\002\000\000\000"- 00:08:08.660 [2024-11-08 04:51:43.689485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.689513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.689568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.689584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.689655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.689672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.689725] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.689741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.660 #20 NEW cov: 11888 ft: 13263 corp: 10/973b lim: 120 exec/s: 0 rss: 68Mb L: 114/118 MS: 1 CrossOver- 00:08:08.660 [2024-11-08 04:51:43.729640] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.729668] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.729711] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:288230376151711744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.729731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.729784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.729799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.660 [2024-11-08 04:51:43.729852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.660 [2024-11-08 04:51:43.729867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.660 #21 NEW cov: 11888 ft: 13338 corp: 11/1073b lim: 120 exec/s: 0 rss: 68Mb L: 100/118 MS: 1 CMP- DE: "\004\000"- 00:08:08.920 [2024-11-08 04:51:43.769630] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.769658] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.769692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.769708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.769762] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.769777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.920 #22 NEW cov: 11888 ft: 13771 corp: 12/1161b lim: 120 exec/s: 0 rss: 68Mb L: 88/118 MS: 1 EraseBytes- 00:08:08.920 [2024-11-08 04:51:43.809867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.809895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 
04:51:43.809935] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.809949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.810002] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.810019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.810074] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:18446744069414649855 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.810090] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.920 #23 NEW cov: 11888 ft: 13806 corp: 13/1279b lim: 120 exec/s: 0 rss: 68Mb L: 118/118 MS: 1 InsertRepeatedBytes- 00:08:08.920 [2024-11-08 04:51:43.850019] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.850047] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.850087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.850107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.850161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.850177] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.850233] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.850249] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.920 #24 NEW cov: 11888 ft: 13830 corp: 14/1393b lim: 120 exec/s: 0 rss: 69Mb L: 114/118 MS: 1 ChangeByte- 00:08:08.920 [2024-11-08 04:51:43.890057] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.890084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.890134] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.890150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.890204] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.890219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.890272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.890287] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.920 #25 NEW cov: 11888 ft: 13878 corp: 15/1491b lim: 120 exec/s: 0 rss: 69Mb L: 98/118 MS: 1 ShuffleBytes- 00:08:08.920 [2024-11-08 04:51:43.930214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188512063488 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.930241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.930303] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.930318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.930371] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.930388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.930441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.930457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:08.920 #26 NEW cov: 11888 ft: 13885 corp: 16/1593b lim: 120 exec/s: 0 rss: 69Mb L: 102/118 MS: 1 ChangeBinInt- 00:08:08.920 [2024-11-08 04:51:43.970010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:12442509728920939692 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.970037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.920 [2024-11-08 04:51:43.970096] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:12442509728149187756 len:44205 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.920 [2024-11-08 04:51:43.970112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.921 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:08.921 #28 NEW cov: 11911 ft: 14267 corp: 17/1651b lim: 120 exec/s: 0 rss: 69Mb L: 58/118 MS: 2 ChangeByte-InsertRepeatedBytes- 00:08:08.921 [2024-11-08 04:51:44.010298] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.921 [2024-11-08 04:51:44.010325] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:08.921 [2024-11-08 04:51:44.010362] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.921 [2024-11-08 04:51:44.010378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:08.921 [2024-11-08 04:51:44.010432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:08.921 [2024-11-08 04:51:44.010448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.180 #29 NEW cov: 11911 ft: 14320 corp: 18/1739b lim: 120 exec/s: 0 rss: 69Mb L: 88/118 MS: 1 ShuffleBytes- 00:08:09.180 [2024-11-08 04:51:44.050595] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.050624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.050682] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:288230376151711744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.050699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.050752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.050769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.050823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.050841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.180 #30 NEW cov: 11911 ft: 14326 corp: 19/1839b lim: 120 exec/s: 30 rss: 69Mb L: 100/118 MS: 1 ChangeBit- 00:08:09.180 [2024-11-08 04:51:44.090746] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188512063488 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.090774] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.090823] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.090838] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.090891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.090907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
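For orientation, the #N NEW lines are standard libFuzzer status output; taking the #30 entry just above as a worked example, the fields read roughly as:

    #30 NEW cov: 11911 ft: 14326 corp: 19/1839b lim: 120 exec/s: 30 rss: 69Mb L: 100/118 MS: 1 ChangeBit-
        cov / ft    : covered code points / coverage features seen so far
        corp        : 19 corpus inputs totalling 1839 bytes
        lim         : current maximum input length (120 bytes)
        exec/s, rss : executions per second and resident memory
        L: 100/118  : this input is 100 bytes; the largest corpus input is 118
        MS          : the mutation sequence that produced it (one ChangeBit)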
00:08:09.180 [2024-11-08 04:51:44.090965] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.090981] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.180 #31 NEW cov: 11911 ft: 14343 corp: 20/1941b lim: 120 exec/s: 31 rss: 69Mb L: 102/118 MS: 1 ChangeBinInt- 00:08:09.180 [2024-11-08 04:51:44.130841] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.130868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.130931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.130948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.131000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.131015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.131068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.131084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.180 #32 NEW cov: 11911 ft: 14359 corp: 21/2054b lim: 120 exec/s: 32 rss: 69Mb L: 113/118 MS: 1 ChangeBit- 00:08:09.180 [2024-11-08 04:51:44.170949] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.170976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.171013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.171029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.171083] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.171098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.171152] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.171169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 
dnr:1 00:08:09.180 #33 NEW cov: 11911 ft: 14364 corp: 22/2167b lim: 120 exec/s: 33 rss: 69Mb L: 113/118 MS: 1 ChangeByte- 00:08:09.180 [2024-11-08 04:51:44.211086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.211114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.211157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7217 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.211172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.211230] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.211246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.211301] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.211318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.180 #34 NEW cov: 11911 ft: 14385 corp: 23/2281b lim: 120 exec/s: 34 rss: 69Mb L: 114/118 MS: 1 CopyPart- 00:08:09.180 [2024-11-08 04:51:44.251245] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.251273] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.251328] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7217 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.251346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.251399] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.251414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.180 [2024-11-08 04:51:44.251467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.180 [2024-11-08 04:51:44.251482] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.181 #35 NEW cov: 11911 ft: 14394 corp: 24/2395b lim: 120 exec/s: 35 rss: 69Mb L: 114/118 MS: 1 ChangeBinInt- 00:08:09.440 [2024-11-08 04:51:44.291333] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188512063488 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 
04:51:44.291361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.291411] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.291427] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.291480] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.291497] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.291556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.291573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.440 #36 NEW cov: 11911 ft: 14405 corp: 25/2497b lim: 120 exec/s: 36 rss: 69Mb L: 102/118 MS: 1 CopyPart- 00:08:09.440 [2024-11-08 04:51:44.331430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188512063488 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.331458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.331497] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.331513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.331584] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.331603] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.331669] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.331685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.440 #37 NEW cov: 11911 ft: 14428 corp: 26/2597b lim: 120 exec/s: 37 rss: 69Mb L: 100/118 MS: 1 EraseBytes- 00:08:09.440 [2024-11-08 04:51:44.371571] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.371599] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.371648] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:288230376151711744 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.371664] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 
cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.371728] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.371743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.371797] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.371814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.440 #38 NEW cov: 11911 ft: 14436 corp: 27/2697b lim: 120 exec/s: 38 rss: 70Mb L: 100/118 MS: 1 PersAutoDict- DE: "\004\000"- 00:08:09.440 [2024-11-08 04:51:44.411337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188512063488 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.411363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.411402] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.411416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.440 #39 NEW cov: 11911 ft: 14440 corp: 28/2751b lim: 120 exec/s: 39 rss: 70Mb L: 54/118 MS: 1 EraseBytes- 00:08:09.440 [2024-11-08 04:51:44.451813] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.451841] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.451880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.451896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.451950] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.451969] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.452020] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.452036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.440 #40 NEW cov: 11911 ft: 14474 corp: 29/2853b lim: 120 exec/s: 40 rss: 70Mb L: 102/118 MS: 1 CopyPart- 00:08:09.440 [2024-11-08 04:51:44.491899] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.491926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.491988] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.492004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.492058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.492074] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.492129] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.492146] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.440 #41 NEW cov: 11911 ft: 14483 corp: 30/2966b lim: 120 exec/s: 41 rss: 70Mb L: 113/118 MS: 1 ChangeByte- 00:08:09.440 [2024-11-08 04:51:44.532031] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188512063488 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.532059] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.532107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.532123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.532176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.532192] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.440 [2024-11-08 04:51:44.532246] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.440 [2024-11-08 04:51:44.532262] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.699 #42 NEW cov: 11911 ft: 14491 corp: 31/3075b lim: 120 exec/s: 42 rss: 70Mb L: 109/118 MS: 1 CopyPart- 00:08:09.699 [2024-11-08 04:51:44.572116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.572143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.572193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.572210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 
04:51:44.572266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.572282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.572337] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.572351] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.700 #43 NEW cov: 11911 ft: 14584 corp: 32/3193b lim: 120 exec/s: 43 rss: 70Mb L: 118/118 MS: 1 ChangeBit- 00:08:09.700 [2024-11-08 04:51:44.612214] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.612240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.612305] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.612322] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.612373] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.612387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.612441] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.612457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.700 #44 NEW cov: 11911 ft: 14594 corp: 33/3293b lim: 120 exec/s: 44 rss: 70Mb L: 100/118 MS: 1 PersAutoDict- DE: "\004\000"- 00:08:09.700 [2024-11-08 04:51:44.652316] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.652343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.652393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.652410] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.652464] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.652480] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.652538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: 
WRITE sqid:1 cid:3 nsid:0 lba:1809352057352363036 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.652555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.700 #45 NEW cov: 11911 ft: 14664 corp: 34/3406b lim: 120 exec/s: 45 rss: 70Mb L: 113/118 MS: 1 ChangeBinInt- 00:08:09.700 [2024-11-08 04:51:44.692461] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.692489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.692534] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.692551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.692621] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.692636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.692692] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.692707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.700 #46 NEW cov: 11911 ft: 14701 corp: 35/3522b lim: 120 exec/s: 46 rss: 70Mb L: 116/118 MS: 1 CopyPart- 00:08:09.700 [2024-11-08 04:51:44.732623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.732651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.732688] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.732705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.732760] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.732776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.732831] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.732847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.700 #47 NEW cov: 11911 ft: 14713 corp: 36/3626b lim: 120 exec/s: 47 rss: 70Mb L: 104/118 MS: 1 PersAutoDict- DE: "\004\000"- 00:08:09.700 
[2024-11-08 04:51:44.772741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.772769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.772819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.772836] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.772891] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.772906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.700 [2024-11-08 04:51:44.772960] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.700 [2024-11-08 04:51:44.772992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.700 #48 NEW cov: 11911 ft: 14725 corp: 37/3740b lim: 120 exec/s: 48 rss: 70Mb L: 114/118 MS: 1 ChangeByte- 00:08:09.960 [2024-11-08 04:51:44.812662] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.812689] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.812729] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.812746] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.812802] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:17179869184 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.812817] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.960 #49 NEW cov: 11911 ft: 14756 corp: 38/3830b lim: 120 exec/s: 49 rss: 70Mb L: 90/118 MS: 1 PersAutoDict- DE: "\004\000"- 00:08:09.960 [2024-11-08 04:51:44.852977] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.853005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.853053] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.853068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.960 
[2024-11-08 04:51:44.853122] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.853138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.853192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466140672 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.853208] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.960 #50 NEW cov: 11911 ft: 14778 corp: 39/3945b lim: 120 exec/s: 50 rss: 70Mb L: 115/118 MS: 1 PersAutoDict- DE: "\004\000"- 00:08:09.960 [2024-11-08 04:51:44.893040] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.893066] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.893116] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.893132] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.893187] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.893203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.893255] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.893271] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.960 #51 NEW cov: 11911 ft: 14847 corp: 40/4058b lim: 120 exec/s: 51 rss: 70Mb L: 113/118 MS: 1 ChangeBit- 00:08:09.960 [2024-11-08 04:51:44.933137] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:144115188512063488 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.933163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.933212] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.933229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.933284] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.933300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.960 
[2024-11-08 04:51:44.933354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.933370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.960 #52 NEW cov: 11911 ft: 14879 corp: 41/4160b lim: 120 exec/s: 52 rss: 70Mb L: 102/118 MS: 1 ChangeByte- 00:08:09.960 [2024-11-08 04:51:44.973269] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926428 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.973297] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.973347] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.973363] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.973417] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.973433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:44.973486] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025498330927995932 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:44.973503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.960 #53 NEW cov: 11911 ft: 14942 corp: 42/4279b lim: 120 exec/s: 53 rss: 70Mb L: 119/119 MS: 1 InsertRepeatedBytes- 00:08:09.960 [2024-11-08 04:51:45.013430] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:45.013457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:45.013498] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:45.013513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:45.013570] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:45.013586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:45.013641] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:2305 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:45.013660] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:45.053537] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:436207616 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:45.053564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:45.053613] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.960 [2024-11-08 04:51:45.053629] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:09.960 [2024-11-08 04:51:45.053683] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:0 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.961 [2024-11-08 04:51:45.053699] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:09.961 [2024-11-08 04:51:45.053752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:0 len:2305 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:09.961 [2024-11-08 04:51:45.053784] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.220 #55 NEW cov: 11911 ft: 14955 corp: 43/4384b lim: 120 exec/s: 55 rss: 70Mb L: 105/119 MS: 2 ChangeBinInt-InsertByte- 00:08:10.220 [2024-11-08 04:51:45.093665] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:0 nsid:0 lba:2025524839165926412 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.220 [2024-11-08 04:51:45.093692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.220 [2024-11-08 04:51:45.093741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:1 nsid:0 lba:2025524839466146844 len:7217 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.220 [2024-11-08 04:51:45.093757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.220 [2024-11-08 04:51:45.093810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:2 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.220 [2024-11-08 04:51:45.093826] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.220 [2024-11-08 04:51:45.093880] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE sqid:1 cid:3 nsid:0 lba:2025524839466146844 len:7197 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:10.220 [2024-11-08 04:51:45.093897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.220 #56 NEW cov: 11911 ft: 14974 corp: 44/4498b lim: 120 exec/s: 28 rss: 70Mb L: 114/119 MS: 1 ChangeBit- 00:08:10.220 #56 DONE cov: 11911 ft: 14974 corp: 44/4498b lim: 120 exec/s: 28 rss: 70Mb 00:08:10.220 ###### Recommended dictionary. ###### 00:08:10.220 "\002\000\000\000" # Uses: 1 00:08:10.220 "\004\000" # Uses: 5 00:08:10.220 ###### End of recommended dictionary. 
###### 00:08:10.220 Done 56 runs in 2 second(s) 00:08:10.220 04:51:45 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_17.conf 00:08:10.220 04:51:45 -- ../common.sh@72 -- # (( i++ )) 00:08:10.220 04:51:45 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:10.220 04:51:45 -- ../common.sh@73 -- # start_llvm_fuzz 18 1 0x1 00:08:10.220 04:51:45 -- nvmf/run.sh@23 -- # local fuzzer_type=18 00:08:10.220 04:51:45 -- nvmf/run.sh@24 -- # local timen=1 00:08:10.220 04:51:45 -- nvmf/run.sh@25 -- # local core=0x1 00:08:10.220 04:51:45 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:10.220 04:51:45 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_18.conf 00:08:10.220 04:51:45 -- nvmf/run.sh@29 -- # printf %02d 18 00:08:10.220 04:51:45 -- nvmf/run.sh@29 -- # port=4418 00:08:10.220 04:51:45 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:10.220 04:51:45 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' 00:08:10.220 04:51:45 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4418"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:10.220 04:51:45 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4418' -c /tmp/fuzz_json_18.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 -Z 18 -r /var/tmp/spdk18.sock 00:08:10.220 [2024-11-08 04:51:45.284391] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:10.220 [2024-11-08 04:51:45.284460] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3686840 ] 00:08:10.220 EAL: No free 2048 kB hugepages reported on node 1 00:08:10.480 [2024-11-08 04:51:45.459637] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.480 [2024-11-08 04:51:45.524024] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:10.480 [2024-11-08 04:51:45.524152] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.480 [2024-11-08 04:51:45.582372] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:10.738 [2024-11-08 04:51:45.598698] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4418 *** 00:08:10.738 INFO: Running with entropic power schedule (0xFF, 100). 00:08:10.738 INFO: Seed: 3239091488 00:08:10.738 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:10.738 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:10.738 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_18 00:08:10.738 INFO: A corpus is not provided, starting from an empty corpus 00:08:10.738 #2 INITED exec/s: 0 rss: 60Mb 00:08:10.738 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:10.738 This may also happen if the target rejected all inputs we tried so far 00:08:10.738 [2024-11-08 04:51:45.664958] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:10.738 [2024-11-08 04:51:45.664999] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.738 [2024-11-08 04:51:45.665095] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:10.738 [2024-11-08 04:51:45.665114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.738 [2024-11-08 04:51:45.665231] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:10.738 [2024-11-08 04:51:45.665250] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.738 [2024-11-08 04:51:45.665366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:10.738 [2024-11-08 04:51:45.665389] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:10.997 NEW_FUNC[1/670]: 0x457e58 in fuzz_nvm_write_zeroes_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:562 00:08:10.997 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:10.997 #8 NEW cov: 11628 ft: 11619 corp: 2/100b lim: 100 exec/s: 0 rss: 68Mb L: 99/99 MS: 1 InsertRepeatedBytes- 00:08:10.997 [2024-11-08 04:51:45.995813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:10.997 [2024-11-08 04:51:45.995867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.997 [2024-11-08 04:51:45.996013] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:10.997 [2024-11-08 04:51:45.996043] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.997 [2024-11-08 04:51:45.996170] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:10.997 [2024-11-08 04:51:45.996197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.997 #9 NEW cov: 11741 ft: 12520 corp: 3/165b lim: 100 exec/s: 0 rss: 68Mb L: 65/99 MS: 1 EraseBytes- 00:08:10.997 [2024-11-08 04:51:46.055802] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:10.997 [2024-11-08 04:51:46.055837] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.997 [2024-11-08 04:51:46.055959] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:10.997 [2024-11-08 04:51:46.055980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.997 [2024-11-08 04:51:46.056100] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:10.997 [2024-11-08 04:51:46.056123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:10.997 #10 NEW cov: 11747 ft: 12717 corp: 4/239b lim: 100 exec/s: 0 rss: 68Mb L: 74/99 MS: 1 EraseBytes- 00:08:10.997 [2024-11-08 04:51:46.106018] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:10.997 [2024-11-08 04:51:46.106050] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:10.997 [2024-11-08 04:51:46.106157] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:10.997 [2024-11-08 04:51:46.106178] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:10.997 [2024-11-08 04:51:46.106289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:10.997 [2024-11-08 04:51:46.106311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.256 #11 NEW cov: 11832 ft: 12999 corp: 5/304b lim: 100 exec/s: 0 rss: 68Mb L: 65/99 MS: 1 ShuffleBytes- 00:08:11.256 [2024-11-08 04:51:46.166528] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.256 [2024-11-08 04:51:46.166558] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.256 [2024-11-08 04:51:46.166655] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.256 [2024-11-08 04:51:46.166677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.256 [2024-11-08 04:51:46.166791] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.256 [2024-11-08 04:51:46.166811] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.256 [2024-11-08 04:51:46.166928] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.256 [2024-11-08 04:51:46.166949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.256 [2024-11-08 04:51:46.167063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:11.257 [2024-11-08 04:51:46.167084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:11.257 #12 NEW cov: 11832 ft: 13164 corp: 6/404b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 CopyPart- 00:08:11.257 [2024-11-08 04:51:46.206367] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.257 [2024-11-08 04:51:46.206396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.206500] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 
nsid:0 00:08:11.257 [2024-11-08 04:51:46.206521] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.206640] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.257 [2024-11-08 04:51:46.206661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.257 #13 NEW cov: 11832 ft: 13311 corp: 7/469b lim: 100 exec/s: 0 rss: 68Mb L: 65/100 MS: 1 ShuffleBytes- 00:08:11.257 [2024-11-08 04:51:46.246067] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.257 [2024-11-08 04:51:46.246096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.246209] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.257 [2024-11-08 04:51:46.246231] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.246350] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.257 [2024-11-08 04:51:46.246374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.246495] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.257 [2024-11-08 04:51:46.246517] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.257 #14 NEW cov: 11832 ft: 13410 corp: 8/568b lim: 100 exec/s: 0 rss: 68Mb L: 99/100 MS: 1 CMP- DE: "\003\000\000\000"- 00:08:11.257 [2024-11-08 04:51:46.286238] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.257 [2024-11-08 04:51:46.286268] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.286358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.257 [2024-11-08 04:51:46.286378] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.286497] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.257 [2024-11-08 04:51:46.286520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.286642] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.257 [2024-11-08 04:51:46.286662] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.257 #15 NEW cov: 11832 ft: 13433 corp: 9/657b lim: 100 exec/s: 0 rss: 68Mb L: 89/100 MS: 1 InsertRepeatedBytes- 00:08:11.257 [2024-11-08 04:51:46.326077] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.257 [2024-11-08 04:51:46.326106] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.257 [2024-11-08 04:51:46.326218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.257 [2024-11-08 04:51:46.326241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.257 #16 NEW cov: 11832 ft: 13728 corp: 10/709b lim: 100 exec/s: 0 rss: 68Mb L: 52/100 MS: 1 EraseBytes- 00:08:11.516 [2024-11-08 04:51:46.366992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.516 [2024-11-08 04:51:46.367021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.516 [2024-11-08 04:51:46.367130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.516 [2024-11-08 04:51:46.367153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.516 [2024-11-08 04:51:46.367265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.516 [2024-11-08 04:51:46.367286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.516 [2024-11-08 04:51:46.367406] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.516 [2024-11-08 04:51:46.367426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.516 #17 NEW cov: 11832 ft: 13755 corp: 11/808b lim: 100 exec/s: 0 rss: 68Mb L: 99/100 MS: 1 ChangeByte- 00:08:11.516 [2024-11-08 04:51:46.407075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.516 [2024-11-08 04:51:46.407104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.516 [2024-11-08 04:51:46.407197] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.516 [2024-11-08 04:51:46.407222] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.516 [2024-11-08 04:51:46.407332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.516 [2024-11-08 04:51:46.407355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.516 [2024-11-08 04:51:46.407467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.516 [2024-11-08 04:51:46.407489] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.516 [2024-11-08 04:51:46.407612] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:11.516 [2024-11-08 04:51:46.407635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:11.516 #18 NEW cov: 11832 ft: 13786 corp: 
12/908b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 CrossOver- 00:08:11.516 [2024-11-08 04:51:46.447121] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.516 [2024-11-08 04:51:46.447150] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.516 [2024-11-08 04:51:46.447245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.516 [2024-11-08 04:51:46.447263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.447379] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.517 [2024-11-08 04:51:46.447400] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.447515] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.517 [2024-11-08 04:51:46.447540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.517 #19 NEW cov: 11832 ft: 13816 corp: 13/1007b lim: 100 exec/s: 0 rss: 69Mb L: 99/100 MS: 1 ShuffleBytes- 00:08:11.517 [2024-11-08 04:51:46.487484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.517 [2024-11-08 04:51:46.487514] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.487594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.517 [2024-11-08 04:51:46.487614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.487732] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.517 [2024-11-08 04:51:46.487754] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.487866] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.517 [2024-11-08 04:51:46.487887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.487971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:11.517 [2024-11-08 04:51:46.487991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:11.517 #20 NEW cov: 11832 ft: 13828 corp: 14/1107b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 ChangeBit- 00:08:11.517 [2024-11-08 04:51:46.537606] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.517 [2024-11-08 04:51:46.537636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.537740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:1 nsid:0 00:08:11.517 [2024-11-08 04:51:46.537761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.537879] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.517 [2024-11-08 04:51:46.537897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.538007] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.517 [2024-11-08 04:51:46.538026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.538139] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:11.517 [2024-11-08 04:51:46.538164] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:11.517 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:11.517 #21 NEW cov: 11855 ft: 13891 corp: 15/1207b lim: 100 exec/s: 0 rss: 69Mb L: 100/100 MS: 1 CrossOver- 00:08:11.517 [2024-11-08 04:51:46.587465] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.517 [2024-11-08 04:51:46.587493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.587605] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.517 [2024-11-08 04:51:46.587625] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.517 [2024-11-08 04:51:46.587740] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.517 [2024-11-08 04:51:46.587762] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.517 #22 NEW cov: 11855 ft: 13901 corp: 16/1272b lim: 100 exec/s: 0 rss: 69Mb L: 65/100 MS: 1 ChangeBinInt- 00:08:11.776 [2024-11-08 04:51:46.627826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.776 [2024-11-08 04:51:46.627858] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.776 [2024-11-08 04:51:46.627938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.776 [2024-11-08 04:51:46.627959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.776 [2024-11-08 04:51:46.628076] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.776 [2024-11-08 04:51:46.628095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.776 [2024-11-08 04:51:46.628217] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.776 [2024-11-08 
04:51:46.628240] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.776 #23 NEW cov: 11855 ft: 13927 corp: 17/1371b lim: 100 exec/s: 23 rss: 69Mb L: 99/100 MS: 1 CrossOver- 00:08:11.776 [2024-11-08 04:51:46.677573] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.776 [2024-11-08 04:51:46.677602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.776 [2024-11-08 04:51:46.677712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.776 [2024-11-08 04:51:46.677734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.776 #27 NEW cov: 11855 ft: 13943 corp: 18/1429b lim: 100 exec/s: 27 rss: 69Mb L: 58/100 MS: 4 ShuffleBytes-InsertByte-ChangeBinInt-InsertRepeatedBytes- 00:08:11.776 [2024-11-08 04:51:46.728037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.776 [2024-11-08 04:51:46.728070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.776 [2024-11-08 04:51:46.728162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.776 [2024-11-08 04:51:46.728181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.776 [2024-11-08 04:51:46.728289] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.776 [2024-11-08 04:51:46.728311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.776 [2024-11-08 04:51:46.728423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.776 [2024-11-08 04:51:46.728442] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.776 #28 NEW cov: 11855 ft: 14051 corp: 19/1523b lim: 100 exec/s: 28 rss: 69Mb L: 94/100 MS: 1 InsertRepeatedBytes- 00:08:11.776 [2024-11-08 04:51:46.777906] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.776 [2024-11-08 04:51:46.777937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.776 [2024-11-08 04:51:46.778064] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.777 [2024-11-08 04:51:46.778083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.777 #29 NEW cov: 11855 ft: 14097 corp: 20/1581b lim: 100 exec/s: 29 rss: 69Mb L: 58/100 MS: 1 ChangeBit- 00:08:11.777 [2024-11-08 04:51:46.838529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.777 [2024-11-08 04:51:46.838560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.777 [2024-11-08 04:51:46.838646] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.777 [2024-11-08 04:51:46.838676] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.777 [2024-11-08 04:51:46.838787] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.777 [2024-11-08 04:51:46.838812] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.777 [2024-11-08 04:51:46.838933] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.777 [2024-11-08 04:51:46.838955] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.777 #31 NEW cov: 11855 ft: 14122 corp: 21/1667b lim: 100 exec/s: 31 rss: 69Mb L: 86/100 MS: 2 ChangeBit-CrossOver- 00:08:11.777 [2024-11-08 04:51:46.878318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:11.777 [2024-11-08 04:51:46.878346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:11.777 [2024-11-08 04:51:46.878436] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:11.777 [2024-11-08 04:51:46.878457] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:11.777 [2024-11-08 04:51:46.878565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:11.777 [2024-11-08 04:51:46.878587] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:11.777 [2024-11-08 04:51:46.878694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:11.777 [2024-11-08 04:51:46.878717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:11.777 [2024-11-08 04:51:46.878843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:11.777 [2024-11-08 04:51:46.878866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.036 #32 NEW cov: 11855 ft: 14136 corp: 22/1767b lim: 100 exec/s: 32 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:08:12.036 [2024-11-08 04:51:46.928332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.036 [2024-11-08 04:51:46.928364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.036 [2024-11-08 04:51:46.928462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.036 [2024-11-08 04:51:46.928484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.036 #33 NEW cov: 11855 ft: 14151 corp: 23/1822b lim: 100 exec/s: 33 rss: 69Mb L: 55/100 MS: 1 EraseBytes- 00:08:12.036 [2024-11-08 04:51:46.979089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) 
sqid:1 cid:0 nsid:0 00:08:12.037 [2024-11-08 04:51:46.979121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:46.979211] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.037 [2024-11-08 04:51:46.979233] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:46.979351] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.037 [2024-11-08 04:51:46.979374] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:46.979485] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:12.037 [2024-11-08 04:51:46.979507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:46.979625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:12.037 [2024-11-08 04:51:46.979646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.037 #34 NEW cov: 11855 ft: 14205 corp: 24/1922b lim: 100 exec/s: 34 rss: 69Mb L: 100/100 MS: 1 CopyPart- 00:08:12.037 [2024-11-08 04:51:47.018529] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.037 [2024-11-08 04:51:47.018560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:47.018664] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.037 [2024-11-08 04:51:47.018687] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:47.018804] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.037 [2024-11-08 04:51:47.018825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:47.018939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:12.037 [2024-11-08 04:51:47.018959] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.037 #35 NEW cov: 11855 ft: 14237 corp: 25/2021b lim: 100 exec/s: 35 rss: 69Mb L: 99/100 MS: 1 ShuffleBytes- 00:08:12.037 [2024-11-08 04:51:47.079213] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.037 [2024-11-08 04:51:47.079242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:47.079340] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.037 [2024-11-08 04:51:47.079364] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 
p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:47.079483] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.037 [2024-11-08 04:51:47.079507] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:47.079631] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:12.037 [2024-11-08 04:51:47.079656] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.037 #36 NEW cov: 11855 ft: 14311 corp: 26/2110b lim: 100 exec/s: 36 rss: 69Mb L: 89/100 MS: 1 CrossOver- 00:08:12.037 [2024-11-08 04:51:47.139245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.037 [2024-11-08 04:51:47.139274] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:47.139390] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.037 [2024-11-08 04:51:47.139414] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.037 [2024-11-08 04:51:47.139544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.037 [2024-11-08 04:51:47.139572] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.296 #37 NEW cov: 11855 ft: 14338 corp: 27/2175b lim: 100 exec/s: 37 rss: 69Mb L: 65/100 MS: 1 ChangeBinInt- 00:08:12.297 [2024-11-08 04:51:47.189773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.297 [2024-11-08 04:51:47.189803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.189881] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.297 [2024-11-08 04:51:47.189905] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.190030] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.297 [2024-11-08 04:51:47.190057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.190173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:12.297 [2024-11-08 04:51:47.190197] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.190314] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:12.297 [2024-11-08 04:51:47.190336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.297 #38 NEW cov: 11855 ft: 14352 corp: 28/2275b lim: 100 exec/s: 38 rss: 69Mb L: 100/100 MS: 1 ChangeBit- 00:08:12.297 [2024-11-08 04:51:47.249673] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.297 [2024-11-08 04:51:47.249707] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.249824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.297 [2024-11-08 04:51:47.249844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.249972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.297 [2024-11-08 04:51:47.249994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.297 #39 NEW cov: 11855 ft: 14370 corp: 29/2351b lim: 100 exec/s: 39 rss: 69Mb L: 76/100 MS: 1 EraseBytes- 00:08:12.297 [2024-11-08 04:51:47.310265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.297 [2024-11-08 04:51:47.310299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.310418] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.297 [2024-11-08 04:51:47.310436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.310555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.297 [2024-11-08 04:51:47.310577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.310699] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:12.297 [2024-11-08 04:51:47.310720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.310840] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:4 nsid:0 00:08:12.297 [2024-11-08 04:51:47.310866] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:12.297 #40 NEW cov: 11855 ft: 14416 corp: 30/2451b lim: 100 exec/s: 40 rss: 70Mb L: 100/100 MS: 1 CopyPart- 00:08:12.297 [2024-11-08 04:51:47.369852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.297 [2024-11-08 04:51:47.369887] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.297 [2024-11-08 04:51:47.370010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.297 [2024-11-08 04:51:47.370036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.297 #41 NEW cov: 11855 ft: 14445 corp: 31/2509b lim: 100 exec/s: 41 rss: 70Mb L: 58/100 MS: 1 PersAutoDict- DE: "\003\000\000\000"- 00:08:12.556 [2024-11-08 04:51:47.430156] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.556 [2024-11-08 04:51:47.430181] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.556 [2024-11-08 04:51:47.430304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.556 [2024-11-08 04:51:47.430325] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.556 [2024-11-08 04:51:47.430452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.556 [2024-11-08 04:51:47.430475] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.556 #42 NEW cov: 11855 ft: 14464 corp: 32/2574b lim: 100 exec/s: 42 rss: 70Mb L: 65/100 MS: 1 EraseBytes- 00:08:12.556 [2024-11-08 04:51:47.480419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.556 [2024-11-08 04:51:47.480449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.556 [2024-11-08 04:51:47.480544] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.556 [2024-11-08 04:51:47.480580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.556 [2024-11-08 04:51:47.480695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.556 [2024-11-08 04:51:47.480718] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.556 [2024-11-08 04:51:47.480839] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:12.556 [2024-11-08 04:51:47.480862] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.556 #43 NEW cov: 11855 ft: 14478 corp: 33/2673b lim: 100 exec/s: 43 rss: 70Mb L: 99/100 MS: 1 CrossOver- 00:08:12.556 [2024-11-08 04:51:47.530416] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.556 [2024-11-08 04:51:47.530446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.556 [2024-11-08 04:51:47.530559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.556 [2024-11-08 04:51:47.530593] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.557 [2024-11-08 04:51:47.530705] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.557 [2024-11-08 04:51:47.530725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.557 #44 NEW cov: 11855 ft: 14486 corp: 34/2738b lim: 100 exec/s: 44 rss: 70Mb L: 65/100 MS: 1 ChangeBit- 00:08:12.557 [2024-11-08 04:51:47.580754] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 
cid:0 nsid:0 00:08:12.557 [2024-11-08 04:51:47.580783] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.557 [2024-11-08 04:51:47.580870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.557 [2024-11-08 04:51:47.580895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.557 [2024-11-08 04:51:47.581016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.557 [2024-11-08 04:51:47.581037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.557 [2024-11-08 04:51:47.581159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:3 nsid:0 00:08:12.557 [2024-11-08 04:51:47.581185] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:12.557 #45 NEW cov: 11855 ft: 14493 corp: 35/2832b lim: 100 exec/s: 45 rss: 70Mb L: 94/100 MS: 1 ShuffleBytes- 00:08:12.557 [2024-11-08 04:51:47.630774] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:0 nsid:0 00:08:12.557 [2024-11-08 04:51:47.630807] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:12.557 [2024-11-08 04:51:47.630946] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:1 nsid:0 00:08:12.557 [2024-11-08 04:51:47.630970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:12.557 [2024-11-08 04:51:47.631089] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: WRITE ZEROES (08) sqid:1 cid:2 nsid:0 00:08:12.557 [2024-11-08 04:51:47.631110] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:12.557 #46 NEW cov: 11855 ft: 14522 corp: 36/2895b lim: 100 exec/s: 23 rss: 70Mb L: 63/100 MS: 1 EraseBytes- 00:08:12.557 #46 DONE cov: 11855 ft: 14522 corp: 36/2895b lim: 100 exec/s: 23 rss: 70Mb 00:08:12.557 ###### Recommended dictionary. ###### 00:08:12.557 "\003\000\000\000" # Uses: 1 00:08:12.557 ###### End of recommended dictionary. 
######
00:08:12.557 Done 46 runs in 2 second(s)
00:08:12.817 04:51:47 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_18.conf
00:08:12.817 04:51:47 -- ../common.sh@72 -- # (( i++ ))
00:08:12.817 04:51:47 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:12.817 04:51:47 -- ../common.sh@73 -- # start_llvm_fuzz 19 1 0x1
00:08:12.817 04:51:47 -- nvmf/run.sh@23 -- # local fuzzer_type=19
00:08:12.817 04:51:47 -- nvmf/run.sh@24 -- # local timen=1
00:08:12.817 04:51:47 -- nvmf/run.sh@25 -- # local core=0x1
00:08:12.817 04:51:47 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:12.817 04:51:47 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_19.conf
00:08:12.817 04:51:47 -- nvmf/run.sh@29 -- # printf %02d 19
00:08:12.817 04:51:47 -- nvmf/run.sh@29 -- # port=4419
00:08:12.817 04:51:47 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:12.817 04:51:47 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419'
00:08:12.817 04:51:47 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4419"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:12.817 04:51:47 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4419' -c /tmp/fuzz_json_19.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19 -Z 19 -r /var/tmp/spdk19.sock
00:08:12.817 [2024-11-08 04:51:47.815590] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:08:12.817 [2024-11-08 04:51:47.815659] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3687377 ]
00:08:12.817 EAL: No free 2048 kB hugepages reported on node 1
00:08:13.076 [2024-11-08 04:51:47.991418] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:13.076 [2024-11-08 04:51:48.054272] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:13.076 [2024-11-08 04:51:48.054393] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:13.076 [2024-11-08 04:51:48.112383] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:13.076 [2024-11-08 04:51:48.128654] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4419 ***
00:08:13.076 INFO: Running with entropic power schedule (0xFF, 100).
00:08:13.076 INFO: Seed: 1473097156
00:08:13.076 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:08:13.076 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:08:13.076 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_19
00:08:13.076 INFO: A corpus is not provided, starting from an empty corpus
00:08:13.076 #2 INITED exec/s: 0 rss: 60Mb
00:08:13.076 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
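The "#N NEW cov: ..." records in this log are standard libFuzzer status lines: "cov" counts covered coverage points (edges), "ft" counts features, "corp" gives the corpus size in units and total bytes, "lim" is the current input-length cap, "exec/s" is throughput, "rss" is resident memory, "L:" is the new unit's length versus the largest unit in the corpus, and "MS:" names the mutation sequence that produced the input. The "Recommended dictionary" block printed at the end of a run lists byte patterns the fuzzer found productive; in a plain libFuzzer target they could be fed back with -dict=. A minimal sketch for summarizing coverage growth from a saved copy of this console output follows; the log filename and the assumption of one status record per line are hypothetical, not part of the SPDK test suite:

#!/usr/bin/env bash
# Sketch: pull final coverage numbers out of a saved fuzz log.
# Assumes ./console.log holds this output with one libFuzzer status record per line.
log=${1:-console.log}
grep -E '#[0-9]+ +(NEW|REDUCE) +cov:' "$log" | awk '
{
    # Records look like: "#36 NEW cov: 11855 ft: 14311 corp: 26/2110b lim: 100 exec/s: 36 rss: 69Mb ..."
    for (i = 1; i <= NF; i++) {
        if ($i == "cov:")    cov  = $(i + 1)
        if ($i == "corp:")   corp = $(i + 1)
        if ($i == "exec/s:") rate = $(i + 1)
    }
}
END { printf "final cov: %s  corp: %s  last exec/s: %s\n", cov, corp, rate }'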
00:08:13.076 This may also happen if the target rejected all inputs we tried so far 00:08:13.076 [2024-11-08 04:51:48.173878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.076 [2024-11-08 04:51:48.173908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.076 [2024-11-08 04:51:48.173962] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.076 [2024-11-08 04:51:48.173979] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.594 NEW_FUNC[1/670]: 0x45ae18 in fuzz_nvm_write_uncorrectable_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:582 00:08:13.594 NEW_FUNC[2/670]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:13.594 #4 NEW cov: 11606 ft: 11607 corp: 2/25b lim: 50 exec/s: 0 rss: 68Mb L: 24/24 MS: 2 CrossOver-InsertRepeatedBytes- 00:08:13.594 [2024-11-08 04:51:48.484701] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.594 [2024-11-08 04:51:48.484741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.484804] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.594 [2024-11-08 04:51:48.484823] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.594 #5 NEW cov: 11719 ft: 12157 corp: 3/49b lim: 50 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ShuffleBytes- 00:08:13.594 [2024-11-08 04:51:48.534725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.594 [2024-11-08 04:51:48.534755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.534810] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.594 [2024-11-08 04:51:48.534825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.594 #6 NEW cov: 11725 ft: 12485 corp: 4/73b lim: 50 exec/s: 0 rss: 68Mb L: 24/24 MS: 1 ShuffleBytes- 00:08:13.594 [2024-11-08 04:51:48.575177] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4195730024608447034 len:14907 00:08:13.594 [2024-11-08 04:51:48.575211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.575248] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4195730024608447034 len:14907 00:08:13.594 [2024-11-08 04:51:48.575264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.575319] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE 
UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4195730024608447034 len:14907 00:08:13.594 [2024-11-08 04:51:48.575334] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.575388] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4195730024608447034 len:14907 00:08:13.594 [2024-11-08 04:51:48.575404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.575455] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:4195730024608447034 len:14859 00:08:13.594 [2024-11-08 04:51:48.575471] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:13.594 #7 NEW cov: 11810 ft: 13036 corp: 5/123b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:13.594 [2024-11-08 04:51:48.614938] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.594 [2024-11-08 04:51:48.614965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.615021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.594 [2024-11-08 04:51:48.615037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.594 #8 NEW cov: 11810 ft: 13113 corp: 6/147b lim: 50 exec/s: 0 rss: 68Mb L: 24/50 MS: 1 ShuffleBytes- 00:08:13.594 [2024-11-08 04:51:48.655008] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.594 [2024-11-08 04:51:48.655036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.655095] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:134140418588672 len:1 00:08:13.594 [2024-11-08 04:51:48.655112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.594 #9 NEW cov: 11810 ft: 13179 corp: 7/171b lim: 50 exec/s: 0 rss: 68Mb L: 24/50 MS: 1 ChangeByte- 00:08:13.594 [2024-11-08 04:51:48.695140] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.594 [2024-11-08 04:51:48.695169] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.594 [2024-11-08 04:51:48.695225] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11520 len:1 00:08:13.594 [2024-11-08 04:51:48.695243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.854 #10 NEW cov: 11810 ft: 13248 corp: 8/196b lim: 50 exec/s: 0 rss: 68Mb L: 25/50 MS: 1 InsertByte- 00:08:13.854 [2024-11-08 04:51:48.735510] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:13.854 
[2024-11-08 04:51:48.735543] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.854 [2024-11-08 04:51:48.735590] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.854 [2024-11-08 04:51:48.735605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.854 [2024-11-08 04:51:48.735660] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:13.854 [2024-11-08 04:51:48.735677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.854 [2024-11-08 04:51:48.735748] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:13.854 [2024-11-08 04:51:48.735764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.854 #11 NEW cov: 11810 ft: 13324 corp: 9/244b lim: 50 exec/s: 0 rss: 68Mb L: 48/50 MS: 1 InsertRepeatedBytes- 00:08:13.854 [2024-11-08 04:51:48.775490] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:13.854 [2024-11-08 04:51:48.775518] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.854 [2024-11-08 04:51:48.775561] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:13.854 [2024-11-08 04:51:48.775577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.854 [2024-11-08 04:51:48.775632] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65291 00:08:13.854 [2024-11-08 04:51:48.775647] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.854 #12 NEW cov: 11810 ft: 13567 corp: 10/274b lim: 50 exec/s: 0 rss: 68Mb L: 30/50 MS: 1 InsertRepeatedBytes- 00:08:13.854 [2024-11-08 04:51:48.815492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.854 [2024-11-08 04:51:48.815520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.854 [2024-11-08 04:51:48.815574] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:122 len:1 00:08:13.854 [2024-11-08 04:51:48.815590] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.854 #13 NEW cov: 11810 ft: 13708 corp: 11/298b lim: 50 exec/s: 0 rss: 68Mb L: 24/50 MS: 1 ShuffleBytes- 00:08:13.854 [2024-11-08 04:51:48.855741] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.854 [2024-11-08 04:51:48.855771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.854 
[2024-11-08 04:51:48.855807] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.854 [2024-11-08 04:51:48.855822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.854 [2024-11-08 04:51:48.855878] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:31232 len:1 00:08:13.854 [2024-11-08 04:51:48.855893] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.855 #14 NEW cov: 11810 ft: 13722 corp: 12/331b lim: 50 exec/s: 0 rss: 69Mb L: 33/50 MS: 1 InsertRepeatedBytes- 00:08:13.855 [2024-11-08 04:51:48.895864] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:13.855 [2024-11-08 04:51:48.895891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.855 [2024-11-08 04:51:48.895953] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:13.855 [2024-11-08 04:51:48.895968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.855 [2024-11-08 04:51:48.896021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:134140418588672 len:1 00:08:13.855 [2024-11-08 04:51:48.896037] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.855 #15 NEW cov: 11810 ft: 13740 corp: 13/368b lim: 50 exec/s: 0 rss: 69Mb L: 37/50 MS: 1 InsertRepeatedBytes- 00:08:13.855 [2024-11-08 04:51:48.936250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:13.855 [2024-11-08 04:51:48.936277] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:13.855 [2024-11-08 04:51:48.936327] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:13.855 [2024-11-08 04:51:48.936342] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:13.855 [2024-11-08 04:51:48.936397] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:13.855 [2024-11-08 04:51:48.936413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:13.855 [2024-11-08 04:51:48.936467] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 00:08:13.855 [2024-11-08 04:51:48.936483] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:13.855 [2024-11-08 04:51:48.936541] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65291 00:08:13.855 [2024-11-08 04:51:48.936558] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.113 #16 NEW cov: 11810 ft: 13818 corp: 14/418b lim: 50 exec/s: 0 rss: 69Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:14.113 [2024-11-08 04:51:48.986042] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.113 [2024-11-08 04:51:48.986070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.113 [2024-11-08 04:51:48.986106] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.113 [2024-11-08 04:51:48.986121] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.113 #17 NEW cov: 11810 ft: 13859 corp: 15/442b lim: 50 exec/s: 0 rss: 69Mb L: 24/50 MS: 1 ShuffleBytes- 00:08:14.113 [2024-11-08 04:51:49.026147] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:24 len:1 00:08:14.113 [2024-11-08 04:51:49.026174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.113 [2024-11-08 04:51:49.026228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.113 [2024-11-08 04:51:49.026243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.114 #23 NEW cov: 11810 ft: 13897 corp: 16/466b lim: 50 exec/s: 0 rss: 69Mb L: 24/50 MS: 1 ChangeBinInt- 00:08:14.114 [2024-11-08 04:51:49.066360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.114 [2024-11-08 04:51:49.066387] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.114 [2024-11-08 04:51:49.066423] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.114 [2024-11-08 04:51:49.066439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.114 [2024-11-08 04:51:49.066492] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8791026472627208192 len:1 00:08:14.114 [2024-11-08 04:51:49.066508] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.114 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:14.114 #24 NEW cov: 11833 ft: 13942 corp: 17/498b lim: 50 exec/s: 0 rss: 69Mb L: 32/50 MS: 1 CrossOver- 00:08:14.114 [2024-11-08 04:51:49.106366] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.114 [2024-11-08 04:51:49.106393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.114 [2024-11-08 04:51:49.106435] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11520 len:1 00:08:14.114 [2024-11-08 04:51:49.106451] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.114 #25 NEW cov: 11833 ft: 13953 corp: 18/524b lim: 50 exec/s: 0 rss: 69Mb L: 26/50 MS: 1 InsertByte- 00:08:14.114 [2024-11-08 04:51:49.146616] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 00:08:14.114 [2024-11-08 04:51:49.146643] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.114 [2024-11-08 04:51:49.146681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551538 len:65536 00:08:14.114 [2024-11-08 04:51:49.146697] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.114 [2024-11-08 04:51:49.146751] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:14.114 [2024-11-08 04:51:49.146767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.114 #26 NEW cov: 11833 ft: 13971 corp: 19/555b lim: 50 exec/s: 26 rss: 69Mb L: 31/50 MS: 1 InsertByte- 00:08:14.114 [2024-11-08 04:51:49.186725] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.114 [2024-11-08 04:51:49.186752] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.114 [2024-11-08 04:51:49.186788] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.114 [2024-11-08 04:51:49.186803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.114 [2024-11-08 04:51:49.186856] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:34339947158700174 len:1 00:08:14.114 [2024-11-08 04:51:49.186872] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.114 #27 NEW cov: 11833 ft: 13989 corp: 20/588b lim: 50 exec/s: 27 rss: 69Mb L: 33/50 MS: 1 InsertByte- 00:08:14.372 [2024-11-08 04:51:49.226995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:1224979098644774912 len:1 00:08:14.372 [2024-11-08 04:51:49.227022] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.372 [2024-11-08 04:51:49.227067] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.372 [2024-11-08 04:51:49.227084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.372 [2024-11-08 04:51:49.227138] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:14.372 [2024-11-08 04:51:49.227165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.372 [2024-11-08 04:51:49.227219] 
nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2046820352 len:1 00:08:14.372 [2024-11-08 04:51:49.227235] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.372 #28 NEW cov: 11833 ft: 14002 corp: 21/629b lim: 50 exec/s: 28 rss: 69Mb L: 41/50 MS: 1 CMP- DE: "\021\000\000\000"- 00:08:14.372 [2024-11-08 04:51:49.267107] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:14.372 [2024-11-08 04:51:49.267135] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.372 [2024-11-08 04:51:49.267193] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:7133701809754865664 len:1 00:08:14.372 [2024-11-08 04:51:49.267210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.372 [2024-11-08 04:51:49.267263] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:14.372 [2024-11-08 04:51:49.267279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.372 [2024-11-08 04:51:49.267334] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:14.372 [2024-11-08 04:51:49.267348] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.373 #29 NEW cov: 11833 ft: 14017 corp: 22/677b lim: 50 exec/s: 29 rss: 69Mb L: 48/50 MS: 1 ChangeByte- 00:08:14.373 [2024-11-08 04:51:49.307086] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.373 [2024-11-08 04:51:49.307113] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.373 [2024-11-08 04:51:49.307151] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.373 [2024-11-08 04:51:49.307167] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.373 [2024-11-08 04:51:49.307220] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:31232 len:1 00:08:14.373 [2024-11-08 04:51:49.307236] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.373 #30 NEW cov: 11833 ft: 14048 corp: 23/710b lim: 50 exec/s: 30 rss: 69Mb L: 33/50 MS: 1 ShuffleBytes- 00:08:14.373 [2024-11-08 04:51:49.347058] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:73014444032 len:1 00:08:14.373 [2024-11-08 04:51:49.347085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.373 [2024-11-08 04:51:49.347139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.373 [2024-11-08 04:51:49.347156] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.373 #31 NEW cov: 11833 ft: 14053 corp: 24/734b lim: 50 exec/s: 31 rss: 69Mb L: 24/50 MS: 1 PersAutoDict- DE: "\021\000\000\000"- 00:08:14.373 [2024-11-08 04:51:49.387295] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.373 [2024-11-08 04:51:49.387324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.373 [2024-11-08 04:51:49.387360] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.373 [2024-11-08 04:51:49.387376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.373 [2024-11-08 04:51:49.387432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:31232 len:1 00:08:14.373 [2024-11-08 04:51:49.387448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.373 #32 NEW cov: 11833 ft: 14086 corp: 25/767b lim: 50 exec/s: 32 rss: 69Mb L: 33/50 MS: 1 ShuffleBytes- 00:08:14.373 [2024-11-08 04:51:49.427354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:73014444032 len:1 00:08:14.373 [2024-11-08 04:51:49.427381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.373 [2024-11-08 04:51:49.427432] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.373 [2024-11-08 04:51:49.427448] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.373 #33 NEW cov: 11833 ft: 14116 corp: 26/791b lim: 50 exec/s: 33 rss: 70Mb L: 24/50 MS: 1 CopyPart- 00:08:14.373 [2024-11-08 04:51:49.467556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709223935 len:65536 00:08:14.373 [2024-11-08 04:51:49.467584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.373 [2024-11-08 04:51:49.467622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:14.373 [2024-11-08 04:51:49.467639] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.373 [2024-11-08 04:51:49.467694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65291 00:08:14.373 [2024-11-08 04:51:49.467710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.637 #34 NEW cov: 11833 ft: 14130 corp: 27/821b lim: 50 exec/s: 34 rss: 70Mb L: 30/50 MS: 1 ChangeBinInt- 00:08:14.637 [2024-11-08 04:51:49.507637] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:768 len:1 00:08:14.637 [2024-11-08 04:51:49.507665] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.507702] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.637 [2024-11-08 04:51:49.507717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.507770] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:14.637 [2024-11-08 04:51:49.507786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.637 #35 NEW cov: 11833 ft: 14167 corp: 28/853b lim: 50 exec/s: 35 rss: 70Mb L: 32/50 MS: 1 CMP- DE: "\003\000\000\000\000\000\000\000"- 00:08:14.637 [2024-11-08 04:51:49.547716] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:73014444056 len:1 00:08:14.637 [2024-11-08 04:51:49.547743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.547814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.637 [2024-11-08 04:51:49.547830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.637 #36 NEW cov: 11833 ft: 14187 corp: 29/877b lim: 50 exec/s: 36 rss: 70Mb L: 24/50 MS: 1 PersAutoDict- DE: "\021\000\000\000"- 00:08:14.637 [2024-11-08 04:51:49.588164] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4195730024608447034 len:14907 00:08:14.637 [2024-11-08 04:51:49.588191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.588241] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4248084370276629050 len:14907 00:08:14.637 [2024-11-08 04:51:49.588257] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.588313] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4195730024608447034 len:14907 00:08:14.637 [2024-11-08 04:51:49.588329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.588382] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4195730024608447034 len:14907 00:08:14.637 [2024-11-08 04:51:49.588397] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.588454] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:4195730024608447034 len:14859 00:08:14.637 [2024-11-08 04:51:49.588469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.637 #37 NEW cov: 11833 ft: 14229 corp: 30/927b lim: 50 exec/s: 37 rss: 70Mb L: 50/50 MS: 1 ChangeByte- 00:08:14.637 [2024-11-08 
04:51:49.628131] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:73014444032 len:1 00:08:14.637 [2024-11-08 04:51:49.628159] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.628204] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.637 [2024-11-08 04:51:49.628220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.628272] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:14.637 [2024-11-08 04:51:49.628289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.628345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:2046820352 len:1 00:08:14.637 [2024-11-08 04:51:49.628360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.637 #38 NEW cov: 11833 ft: 14234 corp: 31/968b lim: 50 exec/s: 38 rss: 70Mb L: 41/50 MS: 1 PersAutoDict- DE: "\021\000\000\000"- 00:08:14.637 [2024-11-08 04:51:49.668130] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:768 len:1 00:08:14.637 [2024-11-08 04:51:49.668158] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.668192] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.637 [2024-11-08 04:51:49.668211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.668266] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:14.637 [2024-11-08 04:51:49.668283] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.637 #39 NEW cov: 11833 ft: 14258 corp: 32/1000b lim: 50 exec/s: 39 rss: 70Mb L: 32/50 MS: 1 ShuffleBytes- 00:08:14.637 [2024-11-08 04:51:49.708289] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:73014444032 len:1 00:08:14.637 [2024-11-08 04:51:49.708318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.708354] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.637 [2024-11-08 04:51:49.708371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.637 [2024-11-08 04:51:49.708424] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744069414584330 len:65536 00:08:14.637 [2024-11-08 04:51:49.708440] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 
m:0 dnr:1 00:08:14.637 #40 NEW cov: 11833 ft: 14276 corp: 33/1030b lim: 50 exec/s: 40 rss: 70Mb L: 30/50 MS: 1 InsertRepeatedBytes- 00:08:14.900 [2024-11-08 04:51:49.748405] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:8192 len:1 00:08:14.900 [2024-11-08 04:51:49.748433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.900 [2024-11-08 04:51:49.748471] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.900 [2024-11-08 04:51:49.748488] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.900 [2024-11-08 04:51:49.748546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:34339947158700174 len:1 00:08:14.900 [2024-11-08 04:51:49.748563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.900 #41 NEW cov: 11833 ft: 14297 corp: 34/1063b lim: 50 exec/s: 41 rss: 70Mb L: 33/50 MS: 1 ChangeBit- 00:08:14.900 [2024-11-08 04:51:49.788605] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:73014444032 len:1 00:08:14.900 [2024-11-08 04:51:49.788634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.900 [2024-11-08 04:51:49.788676] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.900 [2024-11-08 04:51:49.788692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.900 [2024-11-08 04:51:49.788745] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:18 00:08:14.900 [2024-11-08 04:51:49.788761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.900 [2024-11-08 04:51:49.788814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:14.900 [2024-11-08 04:51:49.788829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.900 #42 NEW cov: 11833 ft: 14305 corp: 35/1110b lim: 50 exec/s: 42 rss: 70Mb L: 47/50 MS: 1 CopyPart- 00:08:14.900 [2024-11-08 04:51:49.828824] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:549772591104 len:14907 00:08:14.900 [2024-11-08 04:51:49.828857] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.828894] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:4248084370276629050 len:14907 00:08:14.901 [2024-11-08 04:51:49.828908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.828961] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:4195730024608447034 len:14907 
00:08:14.901 [2024-11-08 04:51:49.828976] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.829046] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:4195730024608447034 len:14907 00:08:14.901 [2024-11-08 04:51:49.829062] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.829118] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:4 nsid:0 lba:4195730024608447034 len:14859 00:08:14.901 [2024-11-08 04:51:49.829134] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:14.901 #43 NEW cov: 11833 ft: 14315 corp: 36/1160b lim: 50 exec/s: 43 rss: 70Mb L: 50/50 MS: 1 CMP- DE: "\001\000\000\000\000\000\000\200"- 00:08:14.901 [2024-11-08 04:51:49.868608] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:14.901 [2024-11-08 04:51:49.868636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.868690] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:11520 len:1 00:08:14.901 [2024-11-08 04:51:49.868705] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.901 #44 NEW cov: 11833 ft: 14331 corp: 37/1186b lim: 50 exec/s: 44 rss: 70Mb L: 26/50 MS: 1 CrossOver- 00:08:14.901 [2024-11-08 04:51:49.908623] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:768 len:1 00:08:14.901 [2024-11-08 04:51:49.908652] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.901 #45 NEW cov: 11833 ft: 14645 corp: 38/1205b lim: 50 exec/s: 45 rss: 70Mb L: 19/50 MS: 1 EraseBytes- 00:08:14.901 [2024-11-08 04:51:49.948990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:4195730023631552570 len:14849 00:08:14.901 [2024-11-08 04:51:49.949018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.949071] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:14.901 [2024-11-08 04:51:49.949088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.949144] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:8791026472627208192 len:1 00:08:14.901 [2024-11-08 04:51:49.949160] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.901 #46 NEW cov: 11833 ft: 14651 corp: 39/1237b lim: 50 exec/s: 46 rss: 70Mb L: 32/50 MS: 1 CrossOver- 00:08:14.901 [2024-11-08 04:51:49.989161] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 
00:08:14.901 [2024-11-08 04:51:49.989189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.989236] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 00:08:14.901 [2024-11-08 04:51:49.989253] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.989310] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 00:08:14.901 [2024-11-08 04:51:49.989327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:14.901 [2024-11-08 04:51:49.989385] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65291 00:08:14.901 [2024-11-08 04:51:49.989402] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:14.901 #47 NEW cov: 11833 ft: 14653 corp: 40/1277b lim: 50 exec/s: 47 rss: 70Mb L: 40/50 MS: 1 InsertRepeatedBytes- 00:08:15.160 [2024-11-08 04:51:50.029752] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:15.160 [2024-11-08 04:51:50.029786] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.160 [2024-11-08 04:51:50.029842] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:609885356032 len:123 00:08:15.160 [2024-11-08 04:51:50.029859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.160 #48 NEW cov: 11833 ft: 14698 corp: 41/1304b lim: 50 exec/s: 48 rss: 70Mb L: 27/50 MS: 1 EraseBytes- 00:08:15.160 [2024-11-08 04:51:50.069326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:768 len:1 00:08:15.160 [2024-11-08 04:51:50.069359] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.160 [2024-11-08 04:51:50.069403] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:15.160 [2024-11-08 04:51:50.069420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.160 [2024-11-08 04:51:50.069476] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:0 len:1 00:08:15.160 [2024-11-08 04:51:50.069491] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.160 #49 NEW cov: 11833 ft: 14788 corp: 42/1336b lim: 50 exec/s: 49 rss: 70Mb L: 32/50 MS: 1 CopyPart- 00:08:15.160 [2024-11-08 04:51:50.109566] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:167772160 len:1 00:08:15.160 [2024-11-08 04:51:50.109598] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:15.160 [2024-11-08 04:51:50.109635] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:2 00:08:15.160 [2024-11-08 04:51:50.109651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.160 [2024-11-08 04:51:50.109704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:100663296 len:1 00:08:15.160 [2024-11-08 04:51:50.109720] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.160 [2024-11-08 04:51:50.109776] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:3 nsid:0 lba:0 len:1 00:08:15.160 [2024-11-08 04:51:50.109792] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.160 #50 NEW cov: 11833 ft: 14849 corp: 43/1384b lim: 50 exec/s: 50 rss: 70Mb L: 48/50 MS: 1 ChangeBinInt- 00:08:15.160 [2024-11-08 04:51:50.149556] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:0 nsid:0 lba:0 len:1 00:08:15.160 [2024-11-08 04:51:50.149586] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.160 [2024-11-08 04:51:50.149622] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:1 nsid:0 lba:0 len:1 00:08:15.160 [2024-11-08 04:51:50.149638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.160 [2024-11-08 04:51:50.149695] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: WRITE UNCORRECTABLE sqid:1 cid:2 nsid:0 lba:2046820352 len:1 00:08:15.160 [2024-11-08 04:51:50.149710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.160 #51 NEW cov: 11833 ft: 14858 corp: 44/1415b lim: 50 exec/s: 25 rss: 70Mb L: 31/50 MS: 1 EraseBytes- 00:08:15.160 #51 DONE cov: 11833 ft: 14858 corp: 44/1415b lim: 50 exec/s: 25 rss: 70Mb 00:08:15.160 ###### Recommended dictionary. ###### 00:08:15.160 "\021\000\000\000" # Uses: 3 00:08:15.160 "\003\000\000\000\000\000\000\000" # Uses: 0 00:08:15.160 "\001\000\000\000\000\000\000\200" # Uses: 0 00:08:15.160 ###### End of recommended dictionary. 
######
00:08:15.160 Done 51 runs in 2 second(s)
00:08:15.419 04:51:50 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_19.conf
00:08:15.419 04:51:50 -- ../common.sh@72 -- # (( i++ ))
00:08:15.419 04:51:50 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:15.419 04:51:50 -- ../common.sh@73 -- # start_llvm_fuzz 20 1 0x1
00:08:15.419 04:51:50 -- nvmf/run.sh@23 -- # local fuzzer_type=20
00:08:15.419 04:51:50 -- nvmf/run.sh@24 -- # local timen=1
00:08:15.419 04:51:50 -- nvmf/run.sh@25 -- # local core=0x1
00:08:15.419 04:51:50 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:08:15.419 04:51:50 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_20.conf
00:08:15.419 04:51:50 -- nvmf/run.sh@29 -- # printf %02d 20
00:08:15.419 04:51:50 -- nvmf/run.sh@29 -- # port=4420
00:08:15.419 04:51:50 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:08:15.419 04:51:50 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420'
00:08:15.419 04:51:50 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4420"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf
00:08:15.419 04:51:50 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4420' -c /tmp/fuzz_json_20.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20 -Z 20 -r /var/tmp/spdk20.sock
00:08:15.419 [2024-11-08 04:51:50.346840] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:08:15.419 [2024-11-08 04:51:50.346923] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3687834 ]
00:08:15.678 EAL: No free 2048 kB hugepages reported on node 1
00:08:15.678 [2024-11-08 04:51:50.530781] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:15.678 [2024-11-08 04:51:50.596773] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:15.678 [2024-11-08 04:51:50.596901] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:15.678 [2024-11-08 04:51:50.655464] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:08:15.678 [2024-11-08 04:51:50.671799] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 ***
00:08:15.678 INFO: Running with entropic power schedule (0xFF, 100).
00:08:15.678 INFO: Seed: 4018118439
00:08:15.678 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5),
00:08:15.678 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668),
00:08:15.678 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_20
00:08:15.678 INFO: A corpus is not provided, starting from an empty corpus
00:08:15.678 #2 INITED exec/s: 0 rss: 60Mb
00:08:15.678 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
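The `INFO:`/`#2 INITED` banner above is libFuzzer starting fuzzer 20 against the NVMe/TCP target, and the `#N NEW cov` records that follow are emitted each time a generated input reaches new coverage in the harness. As a rough, hypothetical sketch only — this is not SPDK's actual llvm_nvme_fuzz.c, whose TestOneInput builds a real struct spdk_nvme_cmd and submits it over the target's qpair — a libFuzzer entry point of this general shape looks like:

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Hypothetical stand-in for the NVMe command the real harness builds with
 * SPDK types and submits to the TCP target; the field layout is illustrative. */
struct toy_nvme_cmd {
	uint8_t  opc;    /* opcode; 0x11 is RESERVATION ACQUIRE, as seen in this log */
	uint32_t nsid;   /* namespace ID */
	uint32_t cdw10;  /* command dword shaped by the fuzz input */
};

/* libFuzzer calls this once per generated input. SanitizerCoverage counters
 * (the "inline 8-bit counters" loaded in the banner above) decide whether the
 * input hit new edges and therefore gets logged as "#N NEW cov: ...". */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size)
{
	struct toy_nvme_cmd cmd;

	if (size < sizeof(cmd)) {
		return 0; /* input too short to shape a command */
	}

	memcpy(&cmd, data, sizeof(cmd));
	cmd.opc = 0x11; /* pin the opcode so every input exercises this handler */

	/* A real harness would now submit cmd to the target; the *NOTICE*
	 * command/completion pairs in this log are printed when the target
	 * completes each fuzzed command with an error status. */
	return 0;
}
```

Inputs that add no new coverage are discarded; in each `NEW` record, `cov`/`ft` count covered edges and features, `corp` is the corpus size, `lim` the current input-length cap, and `rss` resident memory.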
00:08:15.678 This may also happen if the target rejected all inputs we tried so far 00:08:15.678 [2024-11-08 04:51:50.737300] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:15.678 [2024-11-08 04:51:50.737329] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:15.678 [2024-11-08 04:51:50.737366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:15.678 [2024-11-08 04:51:50.737381] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:15.678 [2024-11-08 04:51:50.737432] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:15.678 [2024-11-08 04:51:50.737446] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:15.678 [2024-11-08 04:51:50.737498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:15.678 [2024-11-08 04:51:50.737513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:15.937 NEW_FUNC[1/669]: 0x45c9d8 in fuzz_nvm_reservation_acquire_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:597 00:08:15.937 NEW_FUNC[2/669]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:15.937 #31 NEW cov: 11643 ft: 11665 corp: 2/80b lim: 90 exec/s: 0 rss: 68Mb L: 79/79 MS: 4 InsertByte-EraseBytes-ChangeBit-InsertRepeatedBytes- 00:08:16.230 [2024-11-08 04:51:51.058210] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.230 [2024-11-08 04:51:51.058255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.058326] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.230 [2024-11-08 04:51:51.058350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.058427] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.230 [2024-11-08 04:51:51.058444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.058499] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.230 [2024-11-08 04:51:51.058516] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.230 NEW_FUNC[1/3]: 0x1c59068 in spdk_thread_get_from_ctx /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:797 00:08:16.230 NEW_FUNC[2/3]: 0x1c59208 in spdk_thread_poll /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/thread/thread.c:1153 00:08:16.230 #32 NEW cov: 11777 ft: 12259 corp: 3/160b lim: 90 exec/s: 0 rss: 68Mb L: 80/80 MS: 1 InsertByte- 00:08:16.230 [2024-11-08 
04:51:51.108184] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.230 [2024-11-08 04:51:51.108211] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.108247] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.230 [2024-11-08 04:51:51.108263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.108317] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.230 [2024-11-08 04:51:51.108331] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.108382] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.230 [2024-11-08 04:51:51.108398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.230 #38 NEW cov: 11783 ft: 12491 corp: 4/239b lim: 90 exec/s: 0 rss: 68Mb L: 79/80 MS: 1 ChangeByte- 00:08:16.230 [2024-11-08 04:51:51.148257] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.230 [2024-11-08 04:51:51.148285] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.148338] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.230 [2024-11-08 04:51:51.148353] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.148404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.230 [2024-11-08 04:51:51.148419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.148470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.230 [2024-11-08 04:51:51.148486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.230 #39 NEW cov: 11868 ft: 12804 corp: 5/318b lim: 90 exec/s: 0 rss: 68Mb L: 79/80 MS: 1 CopyPart- 00:08:16.230 [2024-11-08 04:51:51.188383] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.230 [2024-11-08 04:51:51.188408] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.188470] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.230 [2024-11-08 04:51:51.188485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.188539] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 
00:08:16.230 [2024-11-08 04:51:51.188555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.188609] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.230 [2024-11-08 04:51:51.188624] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.230 #40 NEW cov: 11868 ft: 12913 corp: 6/397b lim: 90 exec/s: 0 rss: 68Mb L: 79/80 MS: 1 CopyPart- 00:08:16.230 [2024-11-08 04:51:51.228478] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.230 [2024-11-08 04:51:51.228504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.228558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.230 [2024-11-08 04:51:51.228575] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.228625] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.230 [2024-11-08 04:51:51.228640] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.228695] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.230 [2024-11-08 04:51:51.228710] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.230 #46 NEW cov: 11868 ft: 12987 corp: 7/476b lim: 90 exec/s: 0 rss: 68Mb L: 79/80 MS: 1 ShuffleBytes- 00:08:16.230 [2024-11-08 04:51:51.268608] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.230 [2024-11-08 04:51:51.268635] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.268679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.230 [2024-11-08 04:51:51.268694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.268746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.230 [2024-11-08 04:51:51.268761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.230 [2024-11-08 04:51:51.268812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.231 [2024-11-08 04:51:51.268827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.231 #47 NEW cov: 11868 ft: 13031 corp: 8/556b lim: 90 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 ChangeBinInt- 00:08:16.231 [2024-11-08 04:51:51.308697] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.231 
[2024-11-08 04:51:51.308724] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.231 [2024-11-08 04:51:51.308770] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.231 [2024-11-08 04:51:51.308785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.231 [2024-11-08 04:51:51.308836] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.231 [2024-11-08 04:51:51.308851] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.231 [2024-11-08 04:51:51.308903] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.231 [2024-11-08 04:51:51.308918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.494 #48 NEW cov: 11868 ft: 13047 corp: 9/636b lim: 90 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 ChangeBit- 00:08:16.494 [2024-11-08 04:51:51.348412] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.494 [2024-11-08 04:51:51.348439] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.494 #50 NEW cov: 11868 ft: 14067 corp: 10/659b lim: 90 exec/s: 0 rss: 69Mb L: 23/80 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:16.494 [2024-11-08 04:51:51.388920] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.494 [2024-11-08 04:51:51.388945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.389006] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.494 [2024-11-08 04:51:51.389021] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.389071] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.494 [2024-11-08 04:51:51.389088] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.389141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.494 [2024-11-08 04:51:51.389156] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.494 #51 NEW cov: 11868 ft: 14154 corp: 11/739b lim: 90 exec/s: 0 rss: 69Mb L: 80/80 MS: 1 ChangeBit- 00:08:16.494 [2024-11-08 04:51:51.429050] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.494 [2024-11-08 04:51:51.429076] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.429122] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 
00:08:16.494 [2024-11-08 04:51:51.429138] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.429188] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.494 [2024-11-08 04:51:51.429202] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.429254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.494 [2024-11-08 04:51:51.429267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.494 #52 NEW cov: 11868 ft: 14239 corp: 12/818b lim: 90 exec/s: 0 rss: 69Mb L: 79/80 MS: 1 CrossOver- 00:08:16.494 [2024-11-08 04:51:51.469178] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.494 [2024-11-08 04:51:51.469204] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.469241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.494 [2024-11-08 04:51:51.469255] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.469305] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.494 [2024-11-08 04:51:51.469319] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.469370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.494 [2024-11-08 04:51:51.469385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.494 #53 NEW cov: 11868 ft: 14254 corp: 13/907b lim: 90 exec/s: 0 rss: 69Mb L: 89/89 MS: 1 CrossOver- 00:08:16.494 [2024-11-08 04:51:51.509288] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.494 [2024-11-08 04:51:51.509314] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.509356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.494 [2024-11-08 04:51:51.509371] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.509420] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.494 [2024-11-08 04:51:51.509435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.509488] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.494 [2024-11-08 04:51:51.509503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.494 #59 NEW cov: 11868 ft: 14279 corp: 14/986b lim: 90 exec/s: 0 rss: 69Mb L: 79/89 MS: 1 ChangeBinInt- 00:08:16.494 [2024-11-08 04:51:51.549423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.494 [2024-11-08 04:51:51.549450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.549490] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.494 [2024-11-08 04:51:51.549504] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.549561] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.494 [2024-11-08 04:51:51.549576] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.549628] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.494 [2024-11-08 04:51:51.549642] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.494 #60 NEW cov: 11868 ft: 14334 corp: 15/1075b lim: 90 exec/s: 0 rss: 69Mb L: 89/89 MS: 1 ShuffleBytes- 00:08:16.494 [2024-11-08 04:51:51.589519] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.494 [2024-11-08 04:51:51.589551] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.589597] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.494 [2024-11-08 04:51:51.589613] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.589679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.494 [2024-11-08 04:51:51.589695] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.494 [2024-11-08 04:51:51.589748] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.494 [2024-11-08 04:51:51.589764] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.754 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:16.754 #61 NEW cov: 11891 ft: 14444 corp: 16/1162b lim: 90 exec/s: 0 rss: 69Mb L: 87/89 MS: 1 InsertRepeatedBytes- 00:08:16.754 [2024-11-08 04:51:51.629622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.754 [2024-11-08 04:51:51.629650] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.629688] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 
cid:1 nsid:0 00:08:16.754 [2024-11-08 04:51:51.629702] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.629753] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.754 [2024-11-08 04:51:51.629768] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.629818] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.754 [2024-11-08 04:51:51.629835] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.754 #62 NEW cov: 11891 ft: 14496 corp: 17/1250b lim: 90 exec/s: 0 rss: 69Mb L: 88/89 MS: 1 InsertByte- 00:08:16.754 [2024-11-08 04:51:51.669743] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.754 [2024-11-08 04:51:51.669769] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.669816] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.754 [2024-11-08 04:51:51.669832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.669882] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.754 [2024-11-08 04:51:51.669898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.669948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.754 [2024-11-08 04:51:51.669963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.754 #63 NEW cov: 11891 ft: 14511 corp: 18/1339b lim: 90 exec/s: 0 rss: 69Mb L: 89/89 MS: 1 ShuffleBytes- 00:08:16.754 [2024-11-08 04:51:51.709847] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.754 [2024-11-08 04:51:51.709874] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.709911] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.754 [2024-11-08 04:51:51.709926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.709976] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.754 [2024-11-08 04:51:51.709991] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.710042] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.754 [2024-11-08 04:51:51.710057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.754 #64 NEW cov: 11891 ft: 14541 corp: 19/1418b lim: 90 exec/s: 64 rss: 69Mb L: 79/89 MS: 1 CopyPart- 00:08:16.754 [2024-11-08 04:51:51.749986] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.754 [2024-11-08 04:51:51.750013] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.754 [2024-11-08 04:51:51.750068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.754 [2024-11-08 04:51:51.750084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.755 [2024-11-08 04:51:51.750136] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.755 [2024-11-08 04:51:51.750151] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.755 [2024-11-08 04:51:51.750202] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.755 [2024-11-08 04:51:51.750216] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.755 #65 NEW cov: 11891 ft: 14565 corp: 20/1497b lim: 90 exec/s: 65 rss: 69Mb L: 79/89 MS: 1 ChangeByte- 00:08:16.755 [2024-11-08 04:51:51.790113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.755 [2024-11-08 04:51:51.790140] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.755 [2024-11-08 04:51:51.790180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.755 [2024-11-08 04:51:51.790195] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.755 [2024-11-08 04:51:51.790246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.755 [2024-11-08 04:51:51.790260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.755 [2024-11-08 04:51:51.790313] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.755 [2024-11-08 04:51:51.790328] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.755 #66 NEW cov: 11891 ft: 14574 corp: 21/1576b lim: 90 exec/s: 66 rss: 69Mb L: 79/89 MS: 1 ChangeBit- 00:08:16.755 [2024-11-08 04:51:51.830181] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:16.755 [2024-11-08 04:51:51.830209] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:16.755 [2024-11-08 04:51:51.830245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:16.755 [2024-11-08 04:51:51.830261] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:16.755 [2024-11-08 04:51:51.830311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:16.755 [2024-11-08 04:51:51.830326] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:16.755 [2024-11-08 04:51:51.830377] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:16.755 [2024-11-08 04:51:51.830392] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:16.755 #67 NEW cov: 11891 ft: 14592 corp: 22/1656b lim: 90 exec/s: 67 rss: 69Mb L: 80/89 MS: 1 InsertByte- 00:08:17.015 [2024-11-08 04:51:51.870274] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.015 [2024-11-08 04:51:51.870301] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.870346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.015 [2024-11-08 04:51:51.870362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.870429] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.015 [2024-11-08 04:51:51.870444] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.870496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.015 [2024-11-08 04:51:51.870512] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.015 #68 NEW cov: 11891 ft: 14623 corp: 23/1741b lim: 90 exec/s: 68 rss: 69Mb L: 85/89 MS: 1 InsertRepeatedBytes- 00:08:17.015 [2024-11-08 04:51:51.910582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.015 [2024-11-08 04:51:51.910609] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.910651] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.015 [2024-11-08 04:51:51.910667] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.910718] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.015 [2024-11-08 04:51:51.910733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.910784] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.015 [2024-11-08 04:51:51.910798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 
04:51:51.910851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:17.015 [2024-11-08 04:51:51.910865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.015 #69 NEW cov: 11891 ft: 14669 corp: 24/1831b lim: 90 exec/s: 69 rss: 70Mb L: 90/90 MS: 1 CopyPart- 00:08:17.015 [2024-11-08 04:51:51.950552] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.015 [2024-11-08 04:51:51.950580] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.950637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.015 [2024-11-08 04:51:51.950653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.950703] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.015 [2024-11-08 04:51:51.950719] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.950773] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.015 [2024-11-08 04:51:51.950788] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.015 #70 NEW cov: 11891 ft: 14687 corp: 25/1918b lim: 90 exec/s: 70 rss: 70Mb L: 87/90 MS: 1 ChangeBit- 00:08:17.015 [2024-11-08 04:51:51.990666] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.015 [2024-11-08 04:51:51.990692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.990728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.015 [2024-11-08 04:51:51.990743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.990795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.015 [2024-11-08 04:51:51.990810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:51.990861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.015 [2024-11-08 04:51:51.990876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.015 #71 NEW cov: 11891 ft: 14717 corp: 26/2006b lim: 90 exec/s: 71 rss: 70Mb L: 88/90 MS: 1 ChangeBinInt- 00:08:17.015 [2024-11-08 04:51:52.030806] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.015 [2024-11-08 04:51:52.030832] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 
04:51:52.030872] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.015 [2024-11-08 04:51:52.030886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:52.030936] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.015 [2024-11-08 04:51:52.030950] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:52.031003] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.015 [2024-11-08 04:51:52.031017] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.015 #72 NEW cov: 11891 ft: 14733 corp: 27/2085b lim: 90 exec/s: 72 rss: 70Mb L: 79/90 MS: 1 ChangeByte- 00:08:17.015 [2024-11-08 04:51:52.060885] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.015 [2024-11-08 04:51:52.060912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:52.060966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.015 [2024-11-08 04:51:52.060982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:52.061034] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.015 [2024-11-08 04:51:52.061049] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:52.061100] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.015 [2024-11-08 04:51:52.061115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.015 #73 NEW cov: 11891 ft: 14741 corp: 28/2174b lim: 90 exec/s: 73 rss: 70Mb L: 89/90 MS: 1 ChangeByte- 00:08:17.015 [2024-11-08 04:51:52.101021] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.015 [2024-11-08 04:51:52.101046] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:52.101088] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.015 [2024-11-08 04:51:52.101104] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:52.101156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.015 [2024-11-08 04:51:52.101172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.015 [2024-11-08 04:51:52.101224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 
00:08:17.015 [2024-11-08 04:51:52.101238] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.015 #74 NEW cov: 11891 ft: 14762 corp: 29/2253b lim: 90 exec/s: 74 rss: 70Mb L: 79/90 MS: 1 ChangeBit- 00:08:17.275 [2024-11-08 04:51:52.131086] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.275 [2024-11-08 04:51:52.131112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.131153] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.275 [2024-11-08 04:51:52.131174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.131241] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.275 [2024-11-08 04:51:52.131256] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.131308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.275 [2024-11-08 04:51:52.131323] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.275 #75 NEW cov: 11891 ft: 14779 corp: 30/2342b lim: 90 exec/s: 75 rss: 70Mb L: 89/90 MS: 1 ChangeBit- 00:08:17.275 [2024-11-08 04:51:52.170908] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.275 [2024-11-08 04:51:52.170935] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.170989] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.275 [2024-11-08 04:51:52.171004] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.275 #76 NEW cov: 11891 ft: 15103 corp: 31/2390b lim: 90 exec/s: 76 rss: 70Mb L: 48/90 MS: 1 EraseBytes- 00:08:17.275 [2024-11-08 04:51:52.211301] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.275 [2024-11-08 04:51:52.211327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.211372] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.275 [2024-11-08 04:51:52.211388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.211439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.275 [2024-11-08 04:51:52.211453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.211505] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 
00:08:17.275 [2024-11-08 04:51:52.211520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.275 #77 NEW cov: 11891 ft: 15118 corp: 32/2469b lim: 90 exec/s: 77 rss: 70Mb L: 79/90 MS: 1 ShuffleBytes- 00:08:17.275 [2024-11-08 04:51:52.241392] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.275 [2024-11-08 04:51:52.241418] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.241462] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.275 [2024-11-08 04:51:52.241478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.241531] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.275 [2024-11-08 04:51:52.241547] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.241613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.275 [2024-11-08 04:51:52.241636] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.275 #78 NEW cov: 11891 ft: 15124 corp: 33/2552b lim: 90 exec/s: 78 rss: 70Mb L: 83/90 MS: 1 CMP- DE: "\377\377\001\000"- 00:08:17.275 [2024-11-08 04:51:52.281516] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.275 [2024-11-08 04:51:52.281545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.281610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.275 [2024-11-08 04:51:52.281626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.281679] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.275 [2024-11-08 04:51:52.281694] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.281746] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.275 [2024-11-08 04:51:52.281761] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.275 #79 NEW cov: 11891 ft: 15132 corp: 34/2637b lim: 90 exec/s: 79 rss: 70Mb L: 85/90 MS: 1 InsertRepeatedBytes- 00:08:17.275 [2024-11-08 04:51:52.321643] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.275 [2024-11-08 04:51:52.321669] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.321716] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE 
(11) sqid:1 cid:1 nsid:0 00:08:17.275 [2024-11-08 04:51:52.321732] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.321783] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.275 [2024-11-08 04:51:52.321798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.275 [2024-11-08 04:51:52.321849] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.276 [2024-11-08 04:51:52.321865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.276 #80 NEW cov: 11891 ft: 15140 corp: 35/2726b lim: 90 exec/s: 80 rss: 70Mb L: 89/90 MS: 1 CopyPart- 00:08:17.276 [2024-11-08 04:51:52.361745] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.276 [2024-11-08 04:51:52.361771] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.276 [2024-11-08 04:51:52.361843] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.276 [2024-11-08 04:51:52.361859] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.276 [2024-11-08 04:51:52.361907] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.276 [2024-11-08 04:51:52.361921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.276 [2024-11-08 04:51:52.361970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.276 [2024-11-08 04:51:52.361984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.535 #81 NEW cov: 11891 ft: 15155 corp: 36/2805b lim: 90 exec/s: 81 rss: 70Mb L: 79/90 MS: 1 CMP- DE: "\377\377\377\011"- 00:08:17.535 [2024-11-08 04:51:52.401891] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.535 [2024-11-08 04:51:52.401916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.535 [2024-11-08 04:51:52.401956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.535 [2024-11-08 04:51:52.401970] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.535 [2024-11-08 04:51:52.402037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.535 [2024-11-08 04:51:52.402052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.535 [2024-11-08 04:51:52.402103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.535 [2024-11-08 04:51:52.402118] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.535 #82 NEW cov: 11891 ft: 15167 corp: 37/2891b lim: 90 exec/s: 82 rss: 70Mb L: 86/90 MS: 1 InsertByte- 00:08:17.535 [2024-11-08 04:51:52.441993] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.535 [2024-11-08 04:51:52.442018] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.535 [2024-11-08 04:51:52.442063] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.535 [2024-11-08 04:51:52.442078] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.535 [2024-11-08 04:51:52.442127] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.535 [2024-11-08 04:51:52.442141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.535 [2024-11-08 04:51:52.442191] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.535 [2024-11-08 04:51:52.442205] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.535 #83 NEW cov: 11891 ft: 15169 corp: 38/2980b lim: 90 exec/s: 83 rss: 70Mb L: 89/90 MS: 1 ChangeBinInt- 00:08:17.535 [2024-11-08 04:51:52.481975] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.536 [2024-11-08 04:51:52.482002] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.482037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.536 [2024-11-08 04:51:52.482052] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.482103] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.536 [2024-11-08 04:51:52.482117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.536 #87 NEW cov: 11891 ft: 15504 corp: 39/3037b lim: 90 exec/s: 87 rss: 70Mb L: 57/90 MS: 4 ChangeBit-ChangeByte-ChangeByte-CrossOver- 00:08:17.536 [2024-11-08 04:51:52.522346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.536 [2024-11-08 04:51:52.522372] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.522421] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.536 [2024-11-08 04:51:52.522436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.522484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.536 [2024-11-08 
04:51:52.522502] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.522555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.536 [2024-11-08 04:51:52.522569] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.522620] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:4 nsid:0 00:08:17.536 [2024-11-08 04:51:52.522634] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:17.536 #88 NEW cov: 11891 ft: 15515 corp: 40/3127b lim: 90 exec/s: 88 rss: 70Mb L: 90/90 MS: 1 ShuffleBytes- 00:08:17.536 [2024-11-08 04:51:52.562332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.536 [2024-11-08 04:51:52.562358] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.562405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.536 [2024-11-08 04:51:52.562420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.562473] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.536 [2024-11-08 04:51:52.562486] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.562540] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.536 [2024-11-08 04:51:52.562555] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.536 #89 NEW cov: 11891 ft: 15521 corp: 41/3206b lim: 90 exec/s: 89 rss: 70Mb L: 79/90 MS: 1 InsertRepeatedBytes- 00:08:17.536 [2024-11-08 04:51:52.602446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.536 [2024-11-08 04:51:52.602472] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.602514] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.536 [2024-11-08 04:51:52.602534] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.602582] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.536 [2024-11-08 04:51:52.602597] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.602647] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.536 [2024-11-08 04:51:52.602661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT 
(00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.536 #90 NEW cov: 11891 ft: 15533 corp: 42/3285b lim: 90 exec/s: 90 rss: 70Mb L: 79/90 MS: 1 ChangeBinInt- 00:08:17.536 [2024-11-08 04:51:52.642601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.536 [2024-11-08 04:51:52.642627] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.642676] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.536 [2024-11-08 04:51:52.642692] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.642741] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.536 [2024-11-08 04:51:52.642759] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.536 [2024-11-08 04:51:52.642809] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.536 [2024-11-08 04:51:52.642824] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.796 #91 NEW cov: 11891 ft: 15550 corp: 43/3365b lim: 90 exec/s: 91 rss: 70Mb L: 80/90 MS: 1 ChangeBinInt- 00:08:17.796 [2024-11-08 04:51:52.682256] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.796 [2024-11-08 04:51:52.682282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.796 #92 NEW cov: 11891 ft: 15566 corp: 44/3390b lim: 90 exec/s: 92 rss: 70Mb L: 25/90 MS: 1 CrossOver- 00:08:17.796 [2024-11-08 04:51:52.722777] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:0 nsid:0 00:08:17.796 [2024-11-08 04:51:52.722803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:17.796 [2024-11-08 04:51:52.722850] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:1 nsid:0 00:08:17.796 [2024-11-08 04:51:52.722865] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:17.796 [2024-11-08 04:51:52.722913] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:2 nsid:0 00:08:17.796 [2024-11-08 04:51:52.722926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:17.796 [2024-11-08 04:51:52.722977] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION ACQUIRE (11) sqid:1 cid:3 nsid:0 00:08:17.796 [2024-11-08 04:51:52.722992] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:17.796 #93 NEW cov: 11891 ft: 15569 corp: 45/3478b lim: 90 exec/s: 46 rss: 70Mb L: 88/90 MS: 1 ChangeBit- 00:08:17.796 #93 DONE cov: 11891 ft: 15569 corp: 45/3478b lim: 90 exec/s: 46 rss: 70Mb 00:08:17.796 ###### Recommended dictionary. 
###### 00:08:17.796 "\377\377\001\000" # Uses: 0 00:08:17.796 "\377\377\377\011" # Uses: 0 00:08:17.796 ###### End of recommended dictionary. ###### 00:08:17.796 Done 93 runs in 2 second(s) 00:08:17.796 04:51:52 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_20.conf 00:08:17.796 04:51:52 -- ../common.sh@72 -- # (( i++ )) 00:08:17.796 04:51:52 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:17.796 04:51:52 -- ../common.sh@73 -- # start_llvm_fuzz 21 1 0x1 00:08:17.796 04:51:52 -- nvmf/run.sh@23 -- # local fuzzer_type=21 00:08:17.796 04:51:52 -- nvmf/run.sh@24 -- # local timen=1 00:08:17.796 04:51:52 -- nvmf/run.sh@25 -- # local core=0x1 00:08:17.796 04:51:52 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:17.796 04:51:52 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_21.conf 00:08:17.796 04:51:52 -- nvmf/run.sh@29 -- # printf %02d 21 00:08:17.796 04:51:52 -- nvmf/run.sh@29 -- # port=4421 00:08:17.796 04:51:52 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:17.796 04:51:52 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' 00:08:17.796 04:51:52 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4421"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:17.796 04:51:52 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4421' -c /tmp/fuzz_json_21.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 -Z 21 -r /var/tmp/spdk21.sock 00:08:18.055 [2024-11-08 04:51:52.915393] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:18.055 [2024-11-08 04:51:52.915475] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3688216 ] 00:08:18.055 EAL: No free 2048 kB hugepages reported on node 1 00:08:18.055 [2024-11-08 04:51:53.095173] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.055 [2024-11-08 04:51:53.159959] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:18.055 [2024-11-08 04:51:53.160091] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.314 [2024-11-08 04:51:53.218264] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:18.314 [2024-11-08 04:51:53.234587] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4421 *** 00:08:18.314 INFO: Running with entropic power schedule (0xFF, 100). 00:08:18.315 INFO: Seed: 2284135837 00:08:18.315 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:18.315 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:18.315 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_21 00:08:18.315 INFO: A corpus is not provided, starting from an empty corpus 00:08:18.315 #2 INITED exec/s: 0 rss: 60Mb 00:08:18.315 WARNING: no interesting inputs were found so far. 
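The nvmf/run.sh trace above shows how each one-second pass is staged: the fuzzer index selects the TCP service port (44 plus the zero-padded index, so pass 21 listens on 4421), a per-pass corpus directory is created, the shared fuzz_json.conf gets its trsvcid rewritten for that port, and llvm_nvme_fuzz is launched against the resulting transport ID. A minimal bash sketch of that launch step, reconstructed from the trace -- ROOT stands in for the /var/jenkins/workspace/short-fuzz-phy-autotest/spdk prefix seen in the log, and the redirect of the sed output into the per-pass config is an assumption (the trace records the command but not its redirection):

    #!/usr/bin/env bash
    # Sketch of the traced start_llvm_fuzz step for pass 21. ROOT and the
    # sed redirect are assumptions; every flag value is taken from the trace.
    ROOT=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk
    fuzzer_type=21
    timen=1      # -t: run the fuzzer for one second
    core=0x1     # -m: pin the target to core 0

    port="44$(printf %02d "$fuzzer_type")"       # 21 -> 4421, 22 -> 4422, ...
    corpus_dir="$ROOT/../corpus/llvm_nvmf_$fuzzer_type"
    nvmf_cfg="/tmp/fuzz_json_$fuzzer_type.conf"
    trid="trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:$port"

    mkdir -p "$corpus_dir"
    # Rewrite the default 4420 listener in the shared JSON config for this pass.
    sed -e "s/\"trsvcid\": \"4420\"/\"trsvcid\": \"$port\"/" \
        "$ROOT/test/fuzz/llvm/nvmf/fuzz_json.conf" > "$nvmf_cfg"

    "$ROOT/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz" -m "$core" -s 512 \
        -P "$ROOT/../output/llvm/" -F "$trid" -c "$nvmf_cfg" -t "$timen" \
        -D "$corpus_dir" -Z "$fuzzer_type" -r "/var/tmp/spdk$fuzzer_type.sock"
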
Is the code instrumented for coverage? 00:08:18.315 This may also happen if the target rejected all inputs we tried so far 00:08:18.315 [2024-11-08 04:51:53.280024] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.315 [2024-11-08 04:51:53.280053] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.315 [2024-11-08 04:51:53.280092] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.315 [2024-11-08 04:51:53.280107] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.315 [2024-11-08 04:51:53.280159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:18.315 [2024-11-08 04:51:53.280174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.315 [2024-11-08 04:51:53.280224] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:18.315 [2024-11-08 04:51:53.280239] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.574 NEW_FUNC[1/672]: 0x45fc08 in fuzz_nvm_reservation_release_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:623 00:08:18.574 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:18.574 #3 NEW cov: 11639 ft: 11640 corp: 2/46b lim: 50 exec/s: 0 rss: 68Mb L: 45/45 MS: 1 InsertRepeatedBytes- 00:08:18.574 [2024-11-08 04:51:53.590910] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.574 [2024-11-08 04:51:53.590944] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.574 [2024-11-08 04:51:53.591010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.574 [2024-11-08 04:51:53.591027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.574 [2024-11-08 04:51:53.591085] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:18.574 [2024-11-08 04:51:53.591100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.574 [2024-11-08 04:51:53.591155] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:18.574 [2024-11-08 04:51:53.591173] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.574 #4 NEW cov: 11752 ft: 12135 corp: 3/94b lim: 50 exec/s: 0 rss: 68Mb L: 48/48 MS: 1 InsertRepeatedBytes- 00:08:18.574 [2024-11-08 04:51:53.640795] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.574 [2024-11-08 04:51:53.640822] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 
cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.574 [2024-11-08 04:51:53.640861] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.574 [2024-11-08 04:51:53.640876] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.574 [2024-11-08 04:51:53.640931] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:18.574 [2024-11-08 04:51:53.640945] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.574 #5 NEW cov: 11758 ft: 12696 corp: 4/131b lim: 50 exec/s: 0 rss: 68Mb L: 37/48 MS: 1 EraseBytes- 00:08:18.574 [2024-11-08 04:51:53.681130] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.574 [2024-11-08 04:51:53.681157] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.574 [2024-11-08 04:51:53.681208] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.574 [2024-11-08 04:51:53.681224] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.574 [2024-11-08 04:51:53.681281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:18.574 [2024-11-08 04:51:53.681298] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.574 [2024-11-08 04:51:53.681355] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:18.574 [2024-11-08 04:51:53.681370] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.834 #11 NEW cov: 11843 ft: 12980 corp: 5/177b lim: 50 exec/s: 0 rss: 68Mb L: 46/48 MS: 1 CrossOver- 00:08:18.834 [2024-11-08 04:51:53.720880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.834 [2024-11-08 04:51:53.720908] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.720951] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.834 [2024-11-08 04:51:53.720967] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.834 #19 NEW cov: 11843 ft: 13402 corp: 6/198b lim: 50 exec/s: 0 rss: 68Mb L: 21/48 MS: 3 ChangeBit-ChangeBit-InsertRepeatedBytes- 00:08:18.834 [2024-11-08 04:51:53.761312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.834 [2024-11-08 04:51:53.761337] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.761388] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.834 [2024-11-08 04:51:53.761404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.761458] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:18.834 [2024-11-08 04:51:53.761473] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.761535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:18.834 [2024-11-08 04:51:53.761549] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.834 #20 NEW cov: 11843 ft: 13443 corp: 7/244b lim: 50 exec/s: 0 rss: 68Mb L: 46/48 MS: 1 ChangeBit- 00:08:18.834 [2024-11-08 04:51:53.801568] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.834 [2024-11-08 04:51:53.801612] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.801657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.834 [2024-11-08 04:51:53.801673] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.801729] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:18.834 [2024-11-08 04:51:53.801745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.801799] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:18.834 [2024-11-08 04:51:53.801815] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.801870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:18.834 [2024-11-08 04:51:53.801886] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:18.834 #21 NEW cov: 11843 ft: 13559 corp: 8/294b lim: 50 exec/s: 0 rss: 68Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:18.834 [2024-11-08 04:51:53.841048] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.834 [2024-11-08 04:51:53.841075] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.834 #25 NEW cov: 11843 ft: 14407 corp: 9/304b lim: 50 exec/s: 0 rss: 68Mb L: 10/50 MS: 4 ChangeByte-CMP-CrossOver-InsertByte- DE: "\001\000\000\000\000\000\000\000"- 00:08:18.834 [2024-11-08 04:51:53.881311] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.834 [2024-11-08 04:51:53.881338] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.881380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.834 [2024-11-08 04:51:53.881395] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.834 #26 NEW cov: 11843 ft: 14439 corp: 10/329b lim: 50 exec/s: 0 rss: 68Mb L: 25/50 MS: 1 CrossOver- 00:08:18.834 [2024-11-08 04:51:53.921739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:18.834 [2024-11-08 04:51:53.921765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.921813] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:18.834 [2024-11-08 04:51:53.921829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.921884] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:18.834 [2024-11-08 04:51:53.921898] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:18.834 [2024-11-08 04:51:53.921957] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:18.834 [2024-11-08 04:51:53.921971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.094 #32 NEW cov: 11843 ft: 14457 corp: 11/377b lim: 50 exec/s: 0 rss: 68Mb L: 48/50 MS: 1 CMP- DE: "\377\377\377\377"- 00:08:19.094 [2024-11-08 04:51:53.961905] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.094 [2024-11-08 04:51:53.961933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:53.961970] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.094 [2024-11-08 04:51:53.961986] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:53.962041] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.094 [2024-11-08 04:51:53.962056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:53.962114] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.094 [2024-11-08 04:51:53.962129] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.094 #33 NEW cov: 11843 ft: 14534 corp: 12/423b lim: 50 exec/s: 0 rss: 68Mb L: 46/50 MS: 1 ShuffleBytes- 00:08:19.094 [2024-11-08 04:51:54.002156] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.094 [2024-11-08 04:51:54.002184] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.002235] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.094 [2024-11-08 04:51:54.002250] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.002304] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.094 [2024-11-08 04:51:54.002318] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.002371] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.094 [2024-11-08 04:51:54.002386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.002439] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:19.094 [2024-11-08 04:51:54.002453] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.094 #34 NEW cov: 11843 ft: 14586 corp: 13/473b lim: 50 exec/s: 0 rss: 69Mb L: 50/50 MS: 1 ShuffleBytes- 00:08:19.094 [2024-11-08 04:51:54.041815] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.094 [2024-11-08 04:51:54.041844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.041894] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.094 [2024-11-08 04:51:54.041909] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.094 #35 NEW cov: 11843 ft: 14605 corp: 14/498b lim: 50 exec/s: 0 rss: 69Mb L: 25/50 MS: 1 ChangeBinInt- 00:08:19.094 [2024-11-08 04:51:54.081877] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.094 [2024-11-08 04:51:54.081906] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.081971] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.094 [2024-11-08 04:51:54.081988] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.094 #36 NEW cov: 11843 ft: 14655 corp: 15/523b lim: 50 exec/s: 0 rss: 69Mb L: 25/50 MS: 1 ShuffleBytes- 00:08:19.094 [2024-11-08 04:51:54.122162] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.094 [2024-11-08 04:51:54.122189] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.122228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.094 [2024-11-08 04:51:54.122244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.122315] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.094 [2024-11-08 04:51:54.122331] 
nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.094 #37 NEW cov: 11843 ft: 14716 corp: 16/560b lim: 50 exec/s: 0 rss: 69Mb L: 37/50 MS: 1 ShuffleBytes- 00:08:19.094 [2024-11-08 04:51:54.162443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.094 [2024-11-08 04:51:54.162470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.162517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.094 [2024-11-08 04:51:54.162537] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.162589] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.094 [2024-11-08 04:51:54.162605] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.162662] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.094 [2024-11-08 04:51:54.162677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.094 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:19.094 #38 NEW cov: 11866 ft: 14815 corp: 17/606b lim: 50 exec/s: 0 rss: 69Mb L: 46/50 MS: 1 ChangeBit- 00:08:19.094 [2024-11-08 04:51:54.202310] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.094 [2024-11-08 04:51:54.202336] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.094 [2024-11-08 04:51:54.202375] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.094 [2024-11-08 04:51:54.202390] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.354 #39 NEW cov: 11866 ft: 14826 corp: 18/635b lim: 50 exec/s: 0 rss: 69Mb L: 29/50 MS: 1 PersAutoDict- DE: "\377\377\377\377"- 00:08:19.354 [2024-11-08 04:51:54.242712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.354 [2024-11-08 04:51:54.242739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.242788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.354 [2024-11-08 04:51:54.242806] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.242862] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.354 [2024-11-08 04:51:54.242877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.242932] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.354 [2024-11-08 04:51:54.242947] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.354 #40 NEW cov: 11866 ft: 14875 corp: 19/681b lim: 50 exec/s: 0 rss: 69Mb L: 46/50 MS: 1 ChangeByte- 00:08:19.354 [2024-11-08 04:51:54.282992] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.354 [2024-11-08 04:51:54.283019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.283069] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.354 [2024-11-08 04:51:54.283084] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.283137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.354 [2024-11-08 04:51:54.283152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.283206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.354 [2024-11-08 04:51:54.283220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.283275] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:19.354 [2024-11-08 04:51:54.283289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.354 #41 NEW cov: 11866 ft: 14951 corp: 20/731b lim: 50 exec/s: 41 rss: 69Mb L: 50/50 MS: 1 CopyPart- 00:08:19.354 [2024-11-08 04:51:54.322619] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.354 [2024-11-08 04:51:54.322646] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.322712] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.354 [2024-11-08 04:51:54.322728] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.354 #42 NEW cov: 11866 ft: 15042 corp: 21/757b lim: 50 exec/s: 42 rss: 69Mb L: 26/50 MS: 1 EraseBytes- 00:08:19.354 [2024-11-08 04:51:54.362610] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.354 [2024-11-08 04:51:54.362637] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.354 #43 NEW cov: 11866 ft: 15059 corp: 22/772b lim: 50 exec/s: 43 rss: 69Mb L: 15/50 MS: 1 InsertRepeatedBytes- 00:08:19.354 [2024-11-08 04:51:54.402728] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.354 [2024-11-08 04:51:54.402755] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.354 #44 NEW cov: 11866 ft: 15073 corp: 23/788b lim: 50 exec/s: 44 rss: 69Mb L: 16/50 MS: 1 InsertByte- 00:08:19.354 [2024-11-08 04:51:54.442994] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.354 [2024-11-08 04:51:54.443023] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.354 [2024-11-08 04:51:54.443079] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.354 [2024-11-08 04:51:54.443096] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.614 #45 NEW cov: 11866 ft: 15083 corp: 24/814b lim: 50 exec/s: 45 rss: 69Mb L: 26/50 MS: 1 ChangeBit- 00:08:19.614 [2024-11-08 04:51:54.483410] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.614 [2024-11-08 04:51:54.483436] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.483477] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.614 [2024-11-08 04:51:54.483492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.483548] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.614 [2024-11-08 04:51:54.483564] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.483616] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.614 [2024-11-08 04:51:54.483630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.614 #46 NEW cov: 11866 ft: 15087 corp: 25/862b lim: 50 exec/s: 46 rss: 69Mb L: 48/50 MS: 1 ChangeBinInt- 00:08:19.614 [2024-11-08 04:51:54.523424] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.614 [2024-11-08 04:51:54.523450] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.523498] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.614 [2024-11-08 04:51:54.523513] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.523571] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.614 [2024-11-08 04:51:54.523584] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.523639] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.614 [2024-11-08 04:51:54.523654] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.614 #47 NEW cov: 11866 ft: 15134 corp: 26/907b lim: 50 exec/s: 47 rss: 69Mb L: 45/50 MS: 1 PersAutoDict- DE: "\001\000\000\000\000\000\000\000"- 00:08:19.614 [2024-11-08 04:51:54.563476] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.614 [2024-11-08 04:51:54.563503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.563547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.614 [2024-11-08 04:51:54.563563] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.563617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.614 [2024-11-08 04:51:54.563632] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.614 #48 NEW cov: 11866 ft: 15213 corp: 27/938b lim: 50 exec/s: 48 rss: 69Mb L: 31/50 MS: 1 CrossOver- 00:08:19.614 [2024-11-08 04:51:54.603921] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.614 [2024-11-08 04:51:54.603948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.604000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.614 [2024-11-08 04:51:54.604015] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.604068] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.614 [2024-11-08 04:51:54.604082] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.604138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.614 [2024-11-08 04:51:54.604152] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.604206] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:4 nsid:0 00:08:19.614 [2024-11-08 04:51:54.604221] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:19.614 #49 NEW cov: 11866 ft: 15225 corp: 28/988b lim: 50 exec/s: 49 rss: 69Mb L: 50/50 MS: 1 InsertRepeatedBytes- 00:08:19.614 [2024-11-08 04:51:54.643863] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.614 [2024-11-08 04:51:54.643891] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.643932] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.614 [2024-11-08 04:51:54.643948] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.644004] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.614 [2024-11-08 04:51:54.644019] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.644074] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.614 [2024-11-08 04:51:54.644089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.614 #50 NEW cov: 11866 ft: 15230 corp: 29/1034b lim: 50 exec/s: 50 rss: 69Mb L: 46/50 MS: 1 CMP- DE: "\000\203\226v\247:\030\272"- 00:08:19.614 [2024-11-08 04:51:54.673898] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.614 [2024-11-08 04:51:54.673925] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.614 [2024-11-08 04:51:54.673968] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.615 [2024-11-08 04:51:54.673984] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.615 [2024-11-08 04:51:54.674040] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.615 [2024-11-08 04:51:54.674056] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.615 #51 NEW cov: 11866 ft: 15248 corp: 30/1071b lim: 50 exec/s: 51 rss: 69Mb L: 37/50 MS: 1 EraseBytes- 00:08:19.615 [2024-11-08 04:51:54.713673] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.615 [2024-11-08 04:51:54.713700] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.874 #56 NEW cov: 11866 ft: 15268 corp: 31/1082b lim: 50 exec/s: 56 rss: 69Mb L: 11/50 MS: 5 CrossOver-CopyPart-EraseBytes-ChangeBinInt-CopyPart- 00:08:19.874 [2024-11-08 04:51:54.753869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.874 [2024-11-08 04:51:54.753896] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.753939] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.874 [2024-11-08 04:51:54.753954] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.874 #57 NEW cov: 11866 ft: 15293 corp: 32/1103b lim: 50 exec/s: 57 rss: 70Mb L: 21/50 MS: 1 PersAutoDict- DE: "\000\203\226v\247:\030\272"- 00:08:19.874 [2024-11-08 04:51:54.794366] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.874 [2024-11-08 04:51:54.794393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 
p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.794434] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.874 [2024-11-08 04:51:54.794449] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.794504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.874 [2024-11-08 04:51:54.794519] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.794579] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.874 [2024-11-08 04:51:54.794595] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.874 #58 NEW cov: 11866 ft: 15297 corp: 33/1149b lim: 50 exec/s: 58 rss: 70Mb L: 46/50 MS: 1 ChangeBit- 00:08:19.874 [2024-11-08 04:51:54.834173] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.874 [2024-11-08 04:51:54.834200] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.834245] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.874 [2024-11-08 04:51:54.834260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.874 #59 NEW cov: 11866 ft: 15306 corp: 34/1176b lim: 50 exec/s: 59 rss: 70Mb L: 27/50 MS: 1 CopyPart- 00:08:19.874 [2024-11-08 04:51:54.874533] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.874 [2024-11-08 04:51:54.874560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.874611] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.874 [2024-11-08 04:51:54.874626] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.874681] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.874 [2024-11-08 04:51:54.874696] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.874752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:19.874 [2024-11-08 04:51:54.874767] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:19.874 #60 NEW cov: 11866 ft: 15350 corp: 35/1222b lim: 50 exec/s: 60 rss: 70Mb L: 46/50 MS: 1 ChangeByte- 00:08:19.874 [2024-11-08 04:51:54.914215] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.874 [2024-11-08 04:51:54.914242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 
dnr:1 00:08:19.874 #61 NEW cov: 11866 ft: 15381 corp: 36/1237b lim: 50 exec/s: 61 rss: 70Mb L: 15/50 MS: 1 CopyPart- 00:08:19.874 [2024-11-08 04:51:54.954623] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:19.874 [2024-11-08 04:51:54.954651] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.954690] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:19.874 [2024-11-08 04:51:54.954706] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:19.874 [2024-11-08 04:51:54.954761] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:19.874 [2024-11-08 04:51:54.954777] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:19.874 #62 NEW cov: 11866 ft: 15408 corp: 37/1274b lim: 50 exec/s: 62 rss: 70Mb L: 37/50 MS: 1 CrossOver- 00:08:20.134 [2024-11-08 04:51:54.994461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.134 [2024-11-08 04:51:54.994487] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.134 #63 NEW cov: 11866 ft: 15419 corp: 38/1284b lim: 50 exec/s: 63 rss: 70Mb L: 10/50 MS: 1 ChangeASCIIInt- 00:08:20.134 [2024-11-08 04:51:55.035028] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.134 [2024-11-08 04:51:55.035057] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.035096] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:20.134 [2024-11-08 04:51:55.035112] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.035166] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:20.134 [2024-11-08 04:51:55.035182] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.035239] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:20.134 [2024-11-08 04:51:55.035254] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.134 #64 NEW cov: 11866 ft: 15429 corp: 39/1330b lim: 50 exec/s: 64 rss: 70Mb L: 46/50 MS: 1 ChangeBit- 00:08:20.134 [2024-11-08 04:51:55.075000] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.134 [2024-11-08 04:51:55.075029] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.075087] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:20.134 [2024-11-08 04:51:55.075101] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.075159] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:20.134 [2024-11-08 04:51:55.075175] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.115237] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.134 [2024-11-08 04:51:55.115264] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.115309] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:20.134 [2024-11-08 04:51:55.115324] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.115380] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:20.134 [2024-11-08 04:51:55.115395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.115451] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:20.134 [2024-11-08 04:51:55.115466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.134 #66 NEW cov: 11866 ft: 15446 corp: 40/1370b lim: 50 exec/s: 66 rss: 70Mb L: 40/50 MS: 2 InsertByte-CrossOver- 00:08:20.134 [2024-11-08 04:51:55.155055] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.134 [2024-11-08 04:51:55.155083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.155158] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:20.134 [2024-11-08 04:51:55.155174] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.134 #67 NEW cov: 11866 ft: 15458 corp: 41/1399b lim: 50 exec/s: 67 rss: 70Mb L: 29/50 MS: 1 CMP- DE: "I\000\000\000\000\000\000\000"- 00:08:20.134 [2024-11-08 04:51:55.195294] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.134 [2024-11-08 04:51:55.195320] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.195376] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:20.134 [2024-11-08 04:51:55.195393] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.195450] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:20.134 [2024-11-08 04:51:55.195465] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.134 #68 NEW cov: 11866 ft: 15466 corp: 42/1431b lim: 50 exec/s: 68 rss: 70Mb L: 32/50 MS: 1 InsertRepeatedBytes- 00:08:20.134 [2024-11-08 04:51:55.235558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.134 [2024-11-08 04:51:55.235585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.235637] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:1 nsid:0 00:08:20.134 [2024-11-08 04:51:55.235653] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.235723] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:2 nsid:0 00:08:20.134 [2024-11-08 04:51:55.235739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:20.134 [2024-11-08 04:51:55.235796] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:3 nsid:0 00:08:20.134 [2024-11-08 04:51:55.235810] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:20.394 #69 NEW cov: 11866 ft: 15484 corp: 43/1471b lim: 50 exec/s: 69 rss: 70Mb L: 40/50 MS: 1 EraseBytes- 00:08:20.394 [2024-11-08 04:51:55.275265] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION RELEASE (15) sqid:1 cid:0 nsid:0 00:08:20.394 [2024-11-08 04:51:55.275292] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.394 #70 NEW cov: 11866 ft: 15498 corp: 44/1487b lim: 50 exec/s: 35 rss: 70Mb L: 16/50 MS: 1 ChangeBinInt- 00:08:20.394 #70 DONE cov: 11866 ft: 15498 corp: 44/1487b lim: 50 exec/s: 35 rss: 70Mb 00:08:20.394 ###### Recommended dictionary. ###### 00:08:20.394 "\001\000\000\000\000\000\000\000" # Uses: 1 00:08:20.394 "\377\377\377\377" # Uses: 1 00:08:20.394 "\000\203\226v\247:\030\272" # Uses: 1 00:08:20.394 "I\000\000\000\000\000\000\000" # Uses: 0 00:08:20.394 ###### End of recommended dictionary. 
###### 00:08:20.394 Done 70 runs in 2 second(s) 00:08:20.394 04:51:55 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_21.conf 00:08:20.394 04:51:55 -- ../common.sh@72 -- # (( i++ )) 00:08:20.394 04:51:55 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:20.394 04:51:55 -- ../common.sh@73 -- # start_llvm_fuzz 22 1 0x1 00:08:20.394 04:51:55 -- nvmf/run.sh@23 -- # local fuzzer_type=22 00:08:20.394 04:51:55 -- nvmf/run.sh@24 -- # local timen=1 00:08:20.394 04:51:55 -- nvmf/run.sh@25 -- # local core=0x1 00:08:20.394 04:51:55 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:20.394 04:51:55 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_22.conf 00:08:20.394 04:51:55 -- nvmf/run.sh@29 -- # printf %02d 22 00:08:20.394 04:51:55 -- nvmf/run.sh@29 -- # port=4422 00:08:20.394 04:51:55 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:20.394 04:51:55 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' 00:08:20.394 04:51:55 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4422"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:20.394 04:51:55 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4422' -c /tmp/fuzz_json_22.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 -Z 22 -r /var/tmp/spdk22.sock 00:08:20.394 [2024-11-08 04:51:55.466822] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:20.394 [2024-11-08 04:51:55.466891] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3688760 ] 00:08:20.394 EAL: No free 2048 kB hugepages reported on node 1 00:08:20.653 [2024-11-08 04:51:55.644504] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.653 [2024-11-08 04:51:55.707795] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:20.653 [2024-11-08 04:51:55.707919] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.912 [2024-11-08 04:51:55.765916] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:20.912 [2024-11-08 04:51:55.782239] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4422 *** 00:08:20.912 INFO: Running with entropic power schedule (0xFF, 100). 00:08:20.912 INFO: Seed: 536166965 00:08:20.912 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:20.912 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:20.913 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_22 00:08:20.913 INFO: A corpus is not provided, starting from an empty corpus 00:08:20.913 #2 INITED exec/s: 0 rss: 60Mb 00:08:20.913 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
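Each timed pass in this log closes with the same two summary lines: libFuzzer's final "#N DONE cov: ... ft: ... corp: ..." status and the "Done N runs in 2 second(s)" total (the two completed passes above finish with 93 and 70 executions respectively). A hypothetical post-processing helper, not part of the SPDK tree, for pulling those per-pass summaries out of a saved copy of this console output (the console.log file name is assumed):

    #!/usr/bin/env bash
    # List the closing DONE/runtime lines of every short-fuzz pass in a
    # captured console log; matches substrings, so timestamp prefixes are fine.
    log="${1:-console.log}"
    grep -E '#[0-9]+ DONE cov:|Done [0-9]+ runs in' "$log"
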
00:08:20.913 This may also happen if the target rejected all inputs we tried so far 00:08:20.913 [2024-11-08 04:51:55.831053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:20.913 [2024-11-08 04:51:55.831085] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:20.913 [2024-11-08 04:51:55.831138] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:20.913 [2024-11-08 04:51:55.831154] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:20.913 [2024-11-08 04:51:55.831205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:20.913 [2024-11-08 04:51:55.831220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.172 NEW_FUNC[1/672]: 0x461ed8 in fuzz_nvm_reservation_register_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:644 00:08:21.172 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:21.172 #5 NEW cov: 11665 ft: 11666 corp: 2/62b lim: 85 exec/s: 0 rss: 68Mb L: 61/61 MS: 3 ChangeByte-InsertByte-InsertRepeatedBytes- 00:08:21.172 [2024-11-08 04:51:56.151812] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.172 [2024-11-08 04:51:56.151845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.172 [2024-11-08 04:51:56.151895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.172 [2024-11-08 04:51:56.151910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.172 [2024-11-08 04:51:56.151963] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.172 [2024-11-08 04:51:56.151977] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.172 #6 NEW cov: 11778 ft: 12213 corp: 3/121b lim: 85 exec/s: 0 rss: 69Mb L: 59/61 MS: 1 EraseBytes- 00:08:21.172 [2024-11-08 04:51:56.201888] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.172 [2024-11-08 04:51:56.201916] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.172 [2024-11-08 04:51:56.201950] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.172 [2024-11-08 04:51:56.201965] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.172 [2024-11-08 04:51:56.202015] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.172 [2024-11-08 04:51:56.202030] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 
p:0 m:0 dnr:1 00:08:21.172 #7 NEW cov: 11784 ft: 12333 corp: 4/182b lim: 85 exec/s: 0 rss: 69Mb L: 61/61 MS: 1 ChangeByte- 00:08:21.172 [2024-11-08 04:51:56.242141] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.172 [2024-11-08 04:51:56.242168] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.172 [2024-11-08 04:51:56.242205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.172 [2024-11-08 04:51:56.242220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.172 [2024-11-08 04:51:56.242273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.172 [2024-11-08 04:51:56.242289] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.172 [2024-11-08 04:51:56.242341] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:21.172 [2024-11-08 04:51:56.242355] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.172 #10 NEW cov: 11869 ft: 12859 corp: 5/266b lim: 85 exec/s: 0 rss: 69Mb L: 84/84 MS: 3 InsertByte-CrossOver-InsertRepeatedBytes- 00:08:21.432 [2024-11-08 04:51:56.282113] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.432 [2024-11-08 04:51:56.282141] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.282187] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.432 [2024-11-08 04:51:56.282203] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.282254] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.432 [2024-11-08 04:51:56.282269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.432 #11 NEW cov: 11869 ft: 12990 corp: 6/325b lim: 85 exec/s: 0 rss: 69Mb L: 59/84 MS: 1 ChangeBinInt- 00:08:21.432 [2024-11-08 04:51:56.322214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.432 [2024-11-08 04:51:56.322242] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.322293] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.432 [2024-11-08 04:51:56.322309] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.322360] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.432 [2024-11-08 04:51:56.322376] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.432 #12 NEW cov: 11869 ft: 13083 corp: 7/385b lim: 85 exec/s: 0 rss: 69Mb L: 60/84 MS: 1 InsertByte- 00:08:21.432 [2024-11-08 04:51:56.362073] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.432 [2024-11-08 04:51:56.362100] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.432 #13 NEW cov: 11869 ft: 13962 corp: 8/411b lim: 85 exec/s: 0 rss: 69Mb L: 26/84 MS: 1 CrossOver- 00:08:21.432 [2024-11-08 04:51:56.402452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.432 [2024-11-08 04:51:56.402478] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.402535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.432 [2024-11-08 04:51:56.402550] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.402603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.432 [2024-11-08 04:51:56.402617] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.432 #15 NEW cov: 11869 ft: 14030 corp: 9/473b lim: 85 exec/s: 0 rss: 69Mb L: 62/84 MS: 2 InsertByte-CrossOver- 00:08:21.432 [2024-11-08 04:51:56.432555] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.432 [2024-11-08 04:51:56.432581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.432624] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.432 [2024-11-08 04:51:56.432638] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.432689] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.432 [2024-11-08 04:51:56.432704] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.432 #16 NEW cov: 11869 ft: 14067 corp: 10/534b lim: 85 exec/s: 0 rss: 69Mb L: 61/84 MS: 1 ChangeBit- 00:08:21.432 [2024-11-08 04:51:56.472658] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.432 [2024-11-08 04:51:56.472685] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.472722] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.432 [2024-11-08 04:51:56.472737] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.432 [2024-11-08 04:51:56.472788] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.433 
[2024-11-08 04:51:56.472803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.433 #17 NEW cov: 11869 ft: 14106 corp: 11/595b lim: 85 exec/s: 0 rss: 69Mb L: 61/84 MS: 1 CrossOver- 00:08:21.433 [2024-11-08 04:51:56.512803] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.433 [2024-11-08 04:51:56.512829] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.433 [2024-11-08 04:51:56.512869] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.433 [2024-11-08 04:51:56.512884] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.433 [2024-11-08 04:51:56.512935] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.433 [2024-11-08 04:51:56.512948] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.433 #18 NEW cov: 11869 ft: 14149 corp: 12/656b lim: 85 exec/s: 0 rss: 69Mb L: 61/84 MS: 1 InsertByte- 00:08:21.692 [2024-11-08 04:51:56.552918] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.692 [2024-11-08 04:51:56.552943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.692 [2024-11-08 04:51:56.552979] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.692 [2024-11-08 04:51:56.552994] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.692 [2024-11-08 04:51:56.553046] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.692 [2024-11-08 04:51:56.553061] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.692 #19 NEW cov: 11869 ft: 14158 corp: 13/718b lim: 85 exec/s: 0 rss: 69Mb L: 62/84 MS: 1 InsertByte- 00:08:21.692 [2024-11-08 04:51:56.592851] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.692 [2024-11-08 04:51:56.592877] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.692 [2024-11-08 04:51:56.592930] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.692 [2024-11-08 04:51:56.592949] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.692 #20 NEW cov: 11869 ft: 14578 corp: 14/755b lim: 85 exec/s: 0 rss: 69Mb L: 37/84 MS: 1 EraseBytes- 00:08:21.692 [2024-11-08 04:51:56.633252] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.692 [2024-11-08 04:51:56.633279] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.692 [2024-11-08 04:51:56.633316] 
nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.692 [2024-11-08 04:51:56.633330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.692 [2024-11-08 04:51:56.633381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.692 [2024-11-08 04:51:56.633396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.633445] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:21.693 [2024-11-08 04:51:56.633460] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.693 #21 NEW cov: 11869 ft: 14617 corp: 15/835b lim: 85 exec/s: 0 rss: 69Mb L: 80/84 MS: 1 CopyPart- 00:08:21.693 [2024-11-08 04:51:56.673267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.693 [2024-11-08 04:51:56.673294] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.673328] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.693 [2024-11-08 04:51:56.673343] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.673411] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.693 [2024-11-08 04:51:56.673426] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.693 #22 NEW cov: 11869 ft: 14640 corp: 16/897b lim: 85 exec/s: 0 rss: 69Mb L: 62/84 MS: 1 InsertByte- 00:08:21.693 [2024-11-08 04:51:56.713319] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.693 [2024-11-08 04:51:56.713346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.713381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.693 [2024-11-08 04:51:56.713395] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.713446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.693 [2024-11-08 04:51:56.713461] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.693 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:21.693 #23 NEW cov: 11892 ft: 14681 corp: 17/959b lim: 85 exec/s: 0 rss: 70Mb L: 62/84 MS: 1 ChangeBinInt- 00:08:21.693 [2024-11-08 04:51:56.753622] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.693 [2024-11-08 04:51:56.753649] nvme_qpair.c: 477:spdk_nvme_print_completion: 
*NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.753694] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.693 [2024-11-08 04:51:56.753712] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.753762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.693 [2024-11-08 04:51:56.753776] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.753824] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:21.693 [2024-11-08 04:51:56.753839] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.693 #24 NEW cov: 11892 ft: 14737 corp: 18/1038b lim: 85 exec/s: 0 rss: 70Mb L: 79/84 MS: 1 CrossOver- 00:08:21.693 [2024-11-08 04:51:56.793467] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.693 [2024-11-08 04:51:56.793494] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.693 [2024-11-08 04:51:56.793565] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.693 [2024-11-08 04:51:56.793581] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.952 #25 NEW cov: 11892 ft: 14760 corp: 19/1075b lim: 85 exec/s: 25 rss: 70Mb L: 37/84 MS: 1 ShuffleBytes- 00:08:21.952 [2024-11-08 04:51:56.833617] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.952 [2024-11-08 04:51:56.833644] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.952 [2024-11-08 04:51:56.833698] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.952 [2024-11-08 04:51:56.833713] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.952 #26 NEW cov: 11892 ft: 14785 corp: 20/1112b lim: 85 exec/s: 26 rss: 70Mb L: 37/84 MS: 1 ChangeByte- 00:08:21.952 [2024-11-08 04:51:56.873547] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.953 [2024-11-08 04:51:56.873573] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.953 #27 NEW cov: 11892 ft: 14804 corp: 21/1137b lim: 85 exec/s: 27 rss: 70Mb L: 25/84 MS: 1 EraseBytes- 00:08:21.953 [2024-11-08 04:51:56.913954] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.953 [2024-11-08 04:51:56.913980] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:56.914016] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.953 [2024-11-08 04:51:56.914031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:56.914083] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.953 [2024-11-08 04:51:56.914098] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.953 #28 NEW cov: 11892 ft: 14838 corp: 22/1200b lim: 85 exec/s: 28 rss: 70Mb L: 63/84 MS: 1 InsertByte- 00:08:21.953 [2024-11-08 04:51:56.954218] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.953 [2024-11-08 04:51:56.954246] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:56.954284] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.953 [2024-11-08 04:51:56.954299] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:56.954352] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.953 [2024-11-08 04:51:56.954368] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:56.954419] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:21.953 [2024-11-08 04:51:56.954435] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:21.953 #29 NEW cov: 11892 ft: 14887 corp: 23/1273b lim: 85 exec/s: 29 rss: 70Mb L: 73/84 MS: 1 CopyPart- 00:08:21.953 [2024-11-08 04:51:56.994180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.953 [2024-11-08 04:51:56.994207] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:56.994246] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.953 [2024-11-08 04:51:56.994260] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:56.994312] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.953 [2024-11-08 04:51:56.994327] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.953 #30 NEW cov: 11892 ft: 14906 corp: 24/1333b lim: 85 exec/s: 30 rss: 70Mb L: 60/84 MS: 1 CrossOver- 00:08:21.953 [2024-11-08 04:51:57.034273] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:21.953 [2024-11-08 04:51:57.034300] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:57.034336] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: 
RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:21.953 [2024-11-08 04:51:57.034352] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:21.953 [2024-11-08 04:51:57.034402] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:21.953 [2024-11-08 04:51:57.034417] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:21.953 #31 NEW cov: 11892 ft: 14956 corp: 25/1393b lim: 85 exec/s: 31 rss: 70Mb L: 60/84 MS: 1 InsertByte- 00:08:22.220 [2024-11-08 04:51:57.074405] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.220 [2024-11-08 04:51:57.074432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.074487] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.220 [2024-11-08 04:51:57.074503] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.074562] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.220 [2024-11-08 04:51:57.074577] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.220 #32 NEW cov: 11892 ft: 14975 corp: 26/1453b lim: 85 exec/s: 32 rss: 70Mb L: 60/84 MS: 1 ChangeBit- 00:08:22.220 [2024-11-08 04:51:57.114635] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.220 [2024-11-08 04:51:57.114661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.114702] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.220 [2024-11-08 04:51:57.114717] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.114766] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.220 [2024-11-08 04:51:57.114780] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.114830] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:22.220 [2024-11-08 04:51:57.114844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.220 #33 NEW cov: 11892 ft: 14993 corp: 27/1529b lim: 85 exec/s: 33 rss: 70Mb L: 76/84 MS: 1 InsertRepeatedBytes- 00:08:22.220 [2024-11-08 04:51:57.154759] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.220 [2024-11-08 04:51:57.154785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.154831] nvme_qpair.c: 
256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.220 [2024-11-08 04:51:57.154847] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.154895] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.220 [2024-11-08 04:51:57.154910] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.154961] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:22.220 [2024-11-08 04:51:57.154975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.220 #34 NEW cov: 11892 ft: 15000 corp: 28/1610b lim: 85 exec/s: 34 rss: 70Mb L: 81/84 MS: 1 InsertRepeatedBytes- 00:08:22.220 [2024-11-08 04:51:57.194900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.220 [2024-11-08 04:51:57.194926] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.194972] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.220 [2024-11-08 04:51:57.194987] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.195037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.220 [2024-11-08 04:51:57.195051] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.195102] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:22.220 [2024-11-08 04:51:57.195117] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.234981] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.220 [2024-11-08 04:51:57.235007] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.235053] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.220 [2024-11-08 04:51:57.235068] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.220 [2024-11-08 04:51:57.235137] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.221 [2024-11-08 04:51:57.235155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.221 [2024-11-08 04:51:57.235205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:22.221 [2024-11-08 04:51:57.235220] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR 
FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.221 #36 NEW cov: 11892 ft: 15028 corp: 29/1690b lim: 85 exec/s: 36 rss: 70Mb L: 80/84 MS: 2 ChangeByte-CrossOver- 00:08:22.221 [2024-11-08 04:51:57.274790] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.221 [2024-11-08 04:51:57.274816] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.221 [2024-11-08 04:51:57.274857] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.221 [2024-11-08 04:51:57.274873] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.221 #37 NEW cov: 11892 ft: 15061 corp: 30/1725b lim: 85 exec/s: 37 rss: 70Mb L: 35/84 MS: 1 EraseBytes- 00:08:22.221 [2024-11-08 04:51:57.315203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.221 [2024-11-08 04:51:57.315230] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.221 [2024-11-08 04:51:57.315267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.221 [2024-11-08 04:51:57.315280] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.221 [2024-11-08 04:51:57.315330] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.221 [2024-11-08 04:51:57.315344] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.221 [2024-11-08 04:51:57.315394] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:3 nsid:0 00:08:22.221 [2024-11-08 04:51:57.315409] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:22.481 #38 NEW cov: 11892 ft: 15111 corp: 31/1809b lim: 85 exec/s: 38 rss: 70Mb L: 84/84 MS: 1 ShuffleBytes- 00:08:22.481 [2024-11-08 04:51:57.355194] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.481 [2024-11-08 04:51:57.355219] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.355255] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.481 [2024-11-08 04:51:57.355267] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.355318] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.481 [2024-11-08 04:51:57.355333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.481 #39 NEW cov: 11892 ft: 15158 corp: 32/1869b lim: 85 exec/s: 39 rss: 70Mb L: 60/84 MS: 1 ChangeBit- 00:08:22.481 [2024-11-08 04:51:57.395267] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 
cid:0 nsid:0 00:08:22.481 [2024-11-08 04:51:57.395293] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.395332] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.481 [2024-11-08 04:51:57.395350] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.395399] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.481 [2024-11-08 04:51:57.395413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.481 #40 NEW cov: 11892 ft: 15224 corp: 33/1929b lim: 85 exec/s: 40 rss: 70Mb L: 60/84 MS: 1 ChangeBit- 00:08:22.481 [2024-11-08 04:51:57.435391] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.481 [2024-11-08 04:51:57.435416] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.435452] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.481 [2024-11-08 04:51:57.435467] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.435520] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.481 [2024-11-08 04:51:57.435540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.465203] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.481 [2024-11-08 04:51:57.465229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.481 #42 NEW cov: 11892 ft: 15250 corp: 34/1961b lim: 85 exec/s: 42 rss: 70Mb L: 32/84 MS: 2 ChangeBinInt-EraseBytes- 00:08:22.481 [2024-11-08 04:51:57.505584] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.481 [2024-11-08 04:51:57.505610] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.505646] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.481 [2024-11-08 04:51:57.505661] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.505711] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.481 [2024-11-08 04:51:57.505741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.481 #43 NEW cov: 11892 ft: 15259 corp: 35/2024b lim: 85 exec/s: 43 rss: 70Mb L: 63/84 MS: 1 ShuffleBytes- 00:08:22.481 [2024-11-08 04:51:57.545724] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION 
REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.481 [2024-11-08 04:51:57.545750] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.545805] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.481 [2024-11-08 04:51:57.545820] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.481 [2024-11-08 04:51:57.545870] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.481 [2024-11-08 04:51:57.545885] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.481 #44 NEW cov: 11892 ft: 15296 corp: 36/2084b lim: 85 exec/s: 44 rss: 70Mb L: 60/84 MS: 1 CMP- DE: "\004\000\000\000\000\000\000\000"- 00:08:22.481 [2024-11-08 04:51:57.585644] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.481 [2024-11-08 04:51:57.585671] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.741 #45 NEW cov: 11892 ft: 15301 corp: 37/2109b lim: 85 exec/s: 45 rss: 70Mb L: 25/84 MS: 1 PersAutoDict- DE: "\004\000\000\000\000\000\000\000"- 00:08:22.741 [2024-11-08 04:51:57.625844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.741 [2024-11-08 04:51:57.625870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.625922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.741 [2024-11-08 04:51:57.625938] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.741 #46 NEW cov: 11892 ft: 15326 corp: 38/2153b lim: 85 exec/s: 46 rss: 70Mb L: 44/84 MS: 1 EraseBytes- 00:08:22.741 [2024-11-08 04:51:57.665941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.741 [2024-11-08 04:51:57.665968] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.666019] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.741 [2024-11-08 04:51:57.666034] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.706219] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.741 [2024-11-08 04:51:57.706244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.706281] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.741 [2024-11-08 04:51:57.706296] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 
04:51:57.706345] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.741 [2024-11-08 04:51:57.706360] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.736285] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.741 [2024-11-08 04:51:57.736313] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.736349] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.741 [2024-11-08 04:51:57.736365] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.736415] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.741 [2024-11-08 04:51:57.736429] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.741 #53 NEW cov: 11892 ft: 15331 corp: 39/2216b lim: 85 exec/s: 53 rss: 70Mb L: 63/84 MS: 2 CrossOver-ChangeBinInt- 00:08:22.741 [2024-11-08 04:51:57.766378] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.741 [2024-11-08 04:51:57.766404] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.766461] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.741 [2024-11-08 04:51:57.766477] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.766534] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.741 [2024-11-08 04:51:57.766552] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.741 #54 NEW cov: 11892 ft: 15334 corp: 40/2277b lim: 85 exec/s: 54 rss: 70Mb L: 61/84 MS: 1 ShuffleBytes- 00:08:22.741 [2024-11-08 04:51:57.796466] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:0 nsid:0 00:08:22.741 [2024-11-08 04:51:57.796493] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.796538] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:1 nsid:0 00:08:22.741 [2024-11-08 04:51:57.796554] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:22.741 [2024-11-08 04:51:57.796603] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REGISTER (0d) sqid:1 cid:2 nsid:0 00:08:22.741 [2024-11-08 04:51:57.796618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:22.741 #55 NEW cov: 11892 ft: 15353 corp: 41/2343b lim: 85 exec/s: 27 rss: 70Mb L: 66/84 MS: 1 CMP- DE: 
"\014\000\000\000"- 00:08:22.741 #55 DONE cov: 11892 ft: 15353 corp: 41/2343b lim: 85 exec/s: 27 rss: 70Mb 00:08:22.741 ###### Recommended dictionary. ###### 00:08:22.741 "\004\000\000\000\000\000\000\000" # Uses: 1 00:08:22.741 "\014\000\000\000" # Uses: 0 00:08:22.741 ###### End of recommended dictionary. ###### 00:08:22.741 Done 55 runs in 2 second(s) 00:08:23.000 04:51:57 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_22.conf 00:08:23.000 04:51:57 -- ../common.sh@72 -- # (( i++ )) 00:08:23.000 04:51:57 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:23.000 04:51:57 -- ../common.sh@73 -- # start_llvm_fuzz 23 1 0x1 00:08:23.000 04:51:57 -- nvmf/run.sh@23 -- # local fuzzer_type=23 00:08:23.000 04:51:57 -- nvmf/run.sh@24 -- # local timen=1 00:08:23.000 04:51:57 -- nvmf/run.sh@25 -- # local core=0x1 00:08:23.000 04:51:57 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:23.000 04:51:57 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_23.conf 00:08:23.000 04:51:57 -- nvmf/run.sh@29 -- # printf %02d 23 00:08:23.000 04:51:57 -- nvmf/run.sh@29 -- # port=4423 00:08:23.000 04:51:57 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:23.000 04:51:57 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' 00:08:23.000 04:51:57 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4423"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:23.000 04:51:57 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4423' -c /tmp/fuzz_json_23.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 -Z 23 -r /var/tmp/spdk23.sock 00:08:23.000 [2024-11-08 04:51:57.980213] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:23.001 [2024-11-08 04:51:57.980298] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3689154 ] 00:08:23.001 EAL: No free 2048 kB hugepages reported on node 1 00:08:23.260 [2024-11-08 04:51:58.167873] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.260 [2024-11-08 04:51:58.234629] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:23.260 [2024-11-08 04:51:58.234769] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.260 [2024-11-08 04:51:58.293051] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:23.260 [2024-11-08 04:51:58.309362] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4423 *** 00:08:23.260 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:23.260 INFO: Seed: 3064166025 00:08:23.260 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:23.260 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:23.260 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_23 00:08:23.260 INFO: A corpus is not provided, starting from an empty corpus 00:08:23.260 #2 INITED exec/s: 0 rss: 60Mb 00:08:23.260 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:23.260 This may also happen if the target rejected all inputs we tried so far 00:08:23.260 [2024-11-08 04:51:58.354356] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.260 [2024-11-08 04:51:58.354386] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.778 NEW_FUNC[1/671]: 0x465118 in fuzz_nvm_reservation_report_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:671 00:08:23.778 NEW_FUNC[2/671]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:23.778 #7 NEW cov: 11598 ft: 11599 corp: 2/6b lim: 25 exec/s: 0 rss: 68Mb L: 5/5 MS: 5 CrossOver-CopyPart-CrossOver-CopyPart-InsertByte- 00:08:23.778 [2024-11-08 04:51:58.665180] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.778 [2024-11-08 04:51:58.665212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.778 #9 NEW cov: 11711 ft: 12050 corp: 3/12b lim: 25 exec/s: 0 rss: 68Mb L: 6/6 MS: 2 CopyPart-CrossOver- 00:08:23.778 [2024-11-08 04:51:58.705205] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.778 [2024-11-08 04:51:58.705234] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.778 #11 NEW cov: 11717 ft: 12367 corp: 4/21b lim: 25 exec/s: 0 rss: 68Mb L: 9/9 MS: 2 EraseBytes-CrossOver- 00:08:23.778 [2024-11-08 04:51:58.745443] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.778 [2024-11-08 04:51:58.745470] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.778 [2024-11-08 04:51:58.745521] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:23.778 [2024-11-08 04:51:58.745542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.778 #15 NEW cov: 11802 ft: 13000 corp: 5/33b lim: 25 exec/s: 0 rss: 68Mb L: 12/12 MS: 4 CopyPart-CrossOver-CrossOver-CMP- DE: "\377\202\226~\242\232\203\022"- 00:08:23.778 [2024-11-08 04:51:58.785426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.778 [2024-11-08 04:51:58.785452] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.778 #16 NEW cov: 11802 ft: 13244 corp: 6/39b lim: 25 exec/s: 0 rss: 68Mb L: 6/12 MS: 1 ChangeBinInt- 00:08:23.778 
[2024-11-08 04:51:58.825880] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.778 [2024-11-08 04:51:58.825907] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.778 [2024-11-08 04:51:58.825956] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:23.778 [2024-11-08 04:51:58.825971] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:23.778 [2024-11-08 04:51:58.826026] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:23.778 [2024-11-08 04:51:58.826040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:23.778 [2024-11-08 04:51:58.826099] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:23.778 [2024-11-08 04:51:58.826115] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:23.778 #17 NEW cov: 11802 ft: 13772 corp: 7/59b lim: 25 exec/s: 0 rss: 68Mb L: 20/20 MS: 1 CrossOver- 00:08:23.778 [2024-11-08 04:51:58.865731] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:23.778 [2024-11-08 04:51:58.865757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:23.778 [2024-11-08 04:51:58.865798] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:23.778 [2024-11-08 04:51:58.865814] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.037 #18 NEW cov: 11802 ft: 13835 corp: 8/73b lim: 25 exec/s: 0 rss: 68Mb L: 14/20 MS: 1 PersAutoDict- DE: "\377\202\226~\242\232\203\022"- 00:08:24.037 [2024-11-08 04:51:58.905739] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.037 [2024-11-08 04:51:58.905765] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.038 #19 NEW cov: 11802 ft: 13865 corp: 9/78b lim: 25 exec/s: 0 rss: 68Mb L: 5/20 MS: 1 ChangeBinInt- 00:08:24.038 [2024-11-08 04:51:58.946253] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.038 [2024-11-08 04:51:58.946278] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.038 [2024-11-08 04:51:58.946327] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.038 [2024-11-08 04:51:58.946346] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.038 [2024-11-08 04:51:58.946413] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.038 [2024-11-08 04:51:58.946428] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:24.038 [2024-11-08 04:51:58.946484] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.038 [2024-11-08 04:51:58.946499] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.038 #20 NEW cov: 11802 ft: 13909 corp: 10/99b lim: 25 exec/s: 0 rss: 68Mb L: 21/21 MS: 1 InsertRepeatedBytes- 00:08:24.038 [2024-11-08 04:51:58.985974] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.038 [2024-11-08 04:51:58.986000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.038 #26 NEW cov: 11802 ft: 13955 corp: 11/105b lim: 25 exec/s: 0 rss: 68Mb L: 6/21 MS: 1 ChangeByte- 00:08:24.038 [2024-11-08 04:51:59.026135] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.038 [2024-11-08 04:51:59.026162] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.038 #27 NEW cov: 11802 ft: 14024 corp: 12/112b lim: 25 exec/s: 0 rss: 68Mb L: 7/21 MS: 1 InsertByte- 00:08:24.038 [2024-11-08 04:51:59.066220] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.038 [2024-11-08 04:51:59.066247] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.038 #28 NEW cov: 11802 ft: 14051 corp: 13/118b lim: 25 exec/s: 0 rss: 68Mb L: 6/21 MS: 1 ChangeBinInt- 00:08:24.038 [2024-11-08 04:51:59.106404] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.038 [2024-11-08 04:51:59.106433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.038 #29 NEW cov: 11802 ft: 14068 corp: 14/125b lim: 25 exec/s: 0 rss: 68Mb L: 7/21 MS: 1 ChangeBinInt- 00:08:24.038 [2024-11-08 04:51:59.146776] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.038 [2024-11-08 04:51:59.146803] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.038 [2024-11-08 04:51:59.146844] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.038 [2024-11-08 04:51:59.146861] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.297 [2024-11-08 04:51:59.146916] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.297 [2024-11-08 04:51:59.146933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.297 #30 NEW cov: 11802 ft: 14359 corp: 15/143b lim: 25 exec/s: 0 rss: 69Mb L: 18/21 MS: 1 InsertRepeatedBytes- 00:08:24.297 [2024-11-08 04:51:59.186706] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.297 [2024-11-08 04:51:59.186733] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:24.297 [2024-11-08 04:51:59.186786] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.297 [2024-11-08 04:51:59.186809] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.297 #31 NEW cov: 11802 ft: 14371 corp: 16/157b lim: 25 exec/s: 0 rss: 69Mb L: 14/21 MS: 1 CopyPart- 00:08:24.297 [2024-11-08 04:51:59.226855] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.297 [2024-11-08 04:51:59.226881] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.297 [2024-11-08 04:51:59.226941] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.297 [2024-11-08 04:51:59.226956] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.297 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:24.297 #32 NEW cov: 11825 ft: 14443 corp: 17/171b lim: 25 exec/s: 0 rss: 69Mb L: 14/21 MS: 1 CMP- DE: "\377\377\377\377\001 \307\343"- 00:08:24.297 [2024-11-08 04:51:59.277236] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.297 [2024-11-08 04:51:59.277263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.297 [2024-11-08 04:51:59.277308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.297 [2024-11-08 04:51:59.277330] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.297 [2024-11-08 04:51:59.277381] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.298 [2024-11-08 04:51:59.277396] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.298 [2024-11-08 04:51:59.277449] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.298 [2024-11-08 04:51:59.277464] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.298 #33 NEW cov: 11825 ft: 14469 corp: 18/194b lim: 25 exec/s: 0 rss: 69Mb L: 23/23 MS: 1 CopyPart- 00:08:24.298 [2024-11-08 04:51:59.317358] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.298 [2024-11-08 04:51:59.317388] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.298 [2024-11-08 04:51:59.317426] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.298 [2024-11-08 04:51:59.317441] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.298 [2024-11-08 04:51:59.317496] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.298 [2024-11-08 
04:51:59.317511] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.298 [2024-11-08 04:51:59.317587] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.298 [2024-11-08 04:51:59.317604] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.298 #34 NEW cov: 11825 ft: 14491 corp: 19/215b lim: 25 exec/s: 34 rss: 69Mb L: 21/23 MS: 1 InsertRepeatedBytes- 00:08:24.298 [2024-11-08 04:51:59.357214] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.298 [2024-11-08 04:51:59.357241] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.298 [2024-11-08 04:51:59.357296] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.298 [2024-11-08 04:51:59.357311] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.298 #35 NEW cov: 11825 ft: 14557 corp: 20/227b lim: 25 exec/s: 35 rss: 69Mb L: 12/23 MS: 1 CopyPart- 00:08:24.298 [2024-11-08 04:51:59.397201] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.298 [2024-11-08 04:51:59.397228] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.557 #37 NEW cov: 11825 ft: 14591 corp: 21/236b lim: 25 exec/s: 37 rss: 69Mb L: 9/23 MS: 2 ShuffleBytes-PersAutoDict- DE: "\377\202\226~\242\232\203\022"- 00:08:24.557 [2024-11-08 04:51:59.427559] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.557 [2024-11-08 04:51:59.427585] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.557 [2024-11-08 04:51:59.427633] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.557 [2024-11-08 04:51:59.427648] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.557 [2024-11-08 04:51:59.427710] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.557 [2024-11-08 04:51:59.427725] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.557 #38 NEW cov: 11825 ft: 14597 corp: 22/251b lim: 25 exec/s: 38 rss: 69Mb L: 15/23 MS: 1 EraseBytes- 00:08:24.557 [2024-11-08 04:51:59.467758] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.557 [2024-11-08 04:51:59.467785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.557 [2024-11-08 04:51:59.467852] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.557 [2024-11-08 04:51:59.467868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.557 
[2024-11-08 04:51:59.467922] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.557 [2024-11-08 04:51:59.467940] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.557 [2024-11-08 04:51:59.467995] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.557 [2024-11-08 04:51:59.468011] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.557 #39 NEW cov: 11825 ft: 14683 corp: 23/274b lim: 25 exec/s: 39 rss: 69Mb L: 23/23 MS: 1 CMP- DE: "\000\000\000\000\000\000\000\000"- 00:08:24.557 [2024-11-08 04:51:59.507509] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.557 [2024-11-08 04:51:59.507541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.557 #40 NEW cov: 11825 ft: 14703 corp: 24/282b lim: 25 exec/s: 40 rss: 69Mb L: 8/23 MS: 1 EraseBytes- 00:08:24.558 [2024-11-08 04:51:59.547752] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.558 [2024-11-08 04:51:59.547778] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.558 [2024-11-08 04:51:59.547826] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.558 [2024-11-08 04:51:59.547846] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.558 #41 NEW cov: 11825 ft: 14733 corp: 25/296b lim: 25 exec/s: 41 rss: 70Mb L: 14/23 MS: 1 CopyPart- 00:08:24.558 [2024-11-08 04:51:59.588011] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.558 [2024-11-08 04:51:59.588038] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.558 [2024-11-08 04:51:59.588080] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.558 [2024-11-08 04:51:59.588095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.558 [2024-11-08 04:51:59.588150] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.558 [2024-11-08 04:51:59.588165] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.558 #42 NEW cov: 11825 ft: 14746 corp: 26/311b lim: 25 exec/s: 42 rss: 70Mb L: 15/23 MS: 1 InsertRepeatedBytes- 00:08:24.558 [2024-11-08 04:51:59.628075] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.558 [2024-11-08 04:51:59.628101] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.558 [2024-11-08 04:51:59.628154] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.558 [2024-11-08 04:51:59.628174] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.558 [2024-11-08 04:51:59.628228] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.558 [2024-11-08 04:51:59.628244] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.558 #43 NEW cov: 11825 ft: 14764 corp: 27/326b lim: 25 exec/s: 43 rss: 70Mb L: 15/23 MS: 1 InsertByte- 00:08:24.817 [2024-11-08 04:51:59.668346] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.817 [2024-11-08 04:51:59.668373] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.668423] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.817 [2024-11-08 04:51:59.668445] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.668504] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.817 [2024-11-08 04:51:59.668520] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.668581] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.817 [2024-11-08 04:51:59.668596] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.817 #44 NEW cov: 11825 ft: 14796 corp: 28/347b lim: 25 exec/s: 44 rss: 70Mb L: 21/23 MS: 1 ChangeBit- 00:08:24.817 [2024-11-08 04:51:59.708442] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.817 [2024-11-08 04:51:59.708469] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.708543] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.817 [2024-11-08 04:51:59.708560] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.708613] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.817 [2024-11-08 04:51:59.708628] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.708693] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.817 [2024-11-08 04:51:59.708708] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.817 #45 NEW cov: 11825 ft: 14805 corp: 29/369b lim: 25 exec/s: 45 rss: 70Mb L: 22/23 MS: 1 InsertByte- 00:08:24.817 [2024-11-08 04:51:59.748308] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.817 [2024-11-08 04:51:59.748335] nvme_qpair.c: 
477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.748370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.817 [2024-11-08 04:51:59.748385] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.817 #46 NEW cov: 11825 ft: 14812 corp: 30/381b lim: 25 exec/s: 46 rss: 70Mb L: 12/23 MS: 1 ChangeBinInt- 00:08:24.817 [2024-11-08 04:51:59.788242] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.817 [2024-11-08 04:51:59.788269] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.817 #47 NEW cov: 11825 ft: 14846 corp: 31/389b lim: 25 exec/s: 47 rss: 70Mb L: 8/23 MS: 1 ChangeBinInt- 00:08:24.817 [2024-11-08 04:51:59.828819] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.817 [2024-11-08 04:51:59.828845] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.828900] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:24.817 [2024-11-08 04:51:59.828914] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.828966] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:24.817 [2024-11-08 04:51:59.828982] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:24.817 [2024-11-08 04:51:59.829037] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:24.817 [2024-11-08 04:51:59.829054] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:24.817 #48 NEW cov: 11825 ft: 14848 corp: 32/413b lim: 25 exec/s: 48 rss: 70Mb L: 24/24 MS: 1 CrossOver- 00:08:24.817 [2024-11-08 04:51:59.868535] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.817 [2024-11-08 04:51:59.868561] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:24.817 #49 NEW cov: 11825 ft: 14859 corp: 33/418b lim: 25 exec/s: 49 rss: 70Mb L: 5/24 MS: 1 CrossOver- 00:08:24.817 [2024-11-08 04:51:59.908594] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:24.817 [2024-11-08 04:51:59.908620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.076 #50 NEW cov: 11825 ft: 14898 corp: 34/423b lim: 25 exec/s: 50 rss: 70Mb L: 5/24 MS: 1 ShuffleBytes- 00:08:25.076 [2024-11-08 04:51:59.948853] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.076 [2024-11-08 04:51:59.948879] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 
m:0 dnr:1 00:08:25.076 [2024-11-08 04:51:59.948938] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.076 [2024-11-08 04:51:59.948953] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.076 #51 NEW cov: 11825 ft: 14922 corp: 35/437b lim: 25 exec/s: 51 rss: 70Mb L: 14/24 MS: 1 ChangeByte- 00:08:25.076 [2024-11-08 04:51:59.989010] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.076 [2024-11-08 04:51:59.989036] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.076 [2024-11-08 04:51:59.989091] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.076 [2024-11-08 04:51:59.989105] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.076 #52 NEW cov: 11825 ft: 14995 corp: 36/450b lim: 25 exec/s: 52 rss: 70Mb L: 13/24 MS: 1 InsertByte- 00:08:25.076 [2024-11-08 04:52:00.029370] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.076 [2024-11-08 04:52:00.029398] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.076 [2024-11-08 04:52:00.029446] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.076 [2024-11-08 04:52:00.029466] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.076 [2024-11-08 04:52:00.029527] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:25.076 [2024-11-08 04:52:00.029542] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.076 [2024-11-08 04:52:00.029599] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:25.076 [2024-11-08 04:52:00.029614] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.076 #53 NEW cov: 11825 ft: 15015 corp: 37/472b lim: 25 exec/s: 53 rss: 70Mb L: 22/24 MS: 1 InsertRepeatedBytes- 00:08:25.076 [2024-11-08 04:52:00.079428] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.076 [2024-11-08 04:52:00.079458] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.076 [2024-11-08 04:52:00.079517] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.076 [2024-11-08 04:52:00.079541] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.076 [2024-11-08 04:52:00.079601] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:25.076 [2024-11-08 04:52:00.079620] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:25.076 #54 NEW cov: 11825 ft: 15049 corp: 38/487b lim: 25 exec/s: 54 rss: 70Mb L: 15/24 MS: 1 PersAutoDict- DE: "\000\000\000\000\000\000\000\000"- 00:08:25.076 [2024-11-08 04:52:00.119512] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.076 [2024-11-08 04:52:00.119546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.076 [2024-11-08 04:52:00.119585] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.077 [2024-11-08 04:52:00.119601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.077 [2024-11-08 04:52:00.119657] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:25.077 [2024-11-08 04:52:00.119672] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.077 #55 NEW cov: 11825 ft: 15088 corp: 39/506b lim: 25 exec/s: 55 rss: 70Mb L: 19/24 MS: 1 InsertRepeatedBytes- 00:08:25.077 [2024-11-08 04:52:00.159800] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.077 [2024-11-08 04:52:00.159827] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.077 [2024-11-08 04:52:00.159871] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:1 nsid:0 00:08:25.077 [2024-11-08 04:52:00.159895] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.077 [2024-11-08 04:52:00.159948] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:2 nsid:0 00:08:25.077 [2024-11-08 04:52:00.159963] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:25.077 [2024-11-08 04:52:00.160017] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:3 nsid:0 00:08:25.077 [2024-11-08 04:52:00.160033] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:25.336 #56 NEW cov: 11825 ft: 15105 corp: 40/527b lim: 25 exec/s: 56 rss: 70Mb L: 21/24 MS: 1 EraseBytes- 00:08:25.336 [2024-11-08 04:52:00.209558] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.336 [2024-11-08 04:52:00.209588] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.336 #58 NEW cov: 11825 ft: 15121 corp: 41/536b lim: 25 exec/s: 58 rss: 70Mb L: 9/24 MS: 2 CrossOver-PersAutoDict- DE: "\377\377\377\377\001 \307\343"- 00:08:25.336 [2024-11-08 04:52:00.249762] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.336 [2024-11-08 04:52:00.249789] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.336 [2024-11-08 04:52:00.249829] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) 
sqid:1 cid:1 nsid:0 00:08:25.336 [2024-11-08 04:52:00.249844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:25.336 #59 NEW cov: 11825 ft: 15128 corp: 42/547b lim: 25 exec/s: 59 rss: 70Mb L: 11/24 MS: 1 CMP- DE: "\377\377"- 00:08:25.336 [2024-11-08 04:52:00.299874] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.336 [2024-11-08 04:52:00.299901] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.336 #60 NEW cov: 11825 ft: 15135 corp: 43/556b lim: 25 exec/s: 60 rss: 70Mb L: 9/24 MS: 1 ShuffleBytes- 00:08:25.336 [2024-11-08 04:52:00.339944] nvme_qpair.c: 256:nvme_io_qpair_print_command: *NOTICE*: RESERVATION REPORT (0e) sqid:1 cid:0 nsid:0 00:08:25.336 [2024-11-08 04:52:00.339974] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.336 #61 NEW cov: 11825 ft: 15138 corp: 44/564b lim: 25 exec/s: 30 rss: 70Mb L: 8/24 MS: 1 CopyPart- 00:08:25.336 #61 DONE cov: 11825 ft: 15138 corp: 44/564b lim: 25 exec/s: 30 rss: 70Mb 00:08:25.336 ###### Recommended dictionary. ###### 00:08:25.336 "\377\202\226~\242\232\203\022" # Uses: 2 00:08:25.336 "\377\377\377\377\001 \307\343" # Uses: 1 00:08:25.336 "\000\000\000\000\000\000\000\000" # Uses: 1 00:08:25.336 "\377\377" # Uses: 0 00:08:25.336 ###### End of recommended dictionary. ###### 00:08:25.336 Done 61 runs in 2 second(s) 00:08:25.596 04:52:00 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_23.conf 00:08:25.596 04:52:00 -- ../common.sh@72 -- # (( i++ )) 00:08:25.596 04:52:00 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:25.596 04:52:00 -- ../common.sh@73 -- # start_llvm_fuzz 24 1 0x1 00:08:25.596 04:52:00 -- nvmf/run.sh@23 -- # local fuzzer_type=24 00:08:25.596 04:52:00 -- nvmf/run.sh@24 -- # local timen=1 00:08:25.596 04:52:00 -- nvmf/run.sh@25 -- # local core=0x1 00:08:25.596 04:52:00 -- nvmf/run.sh@26 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:25.596 04:52:00 -- nvmf/run.sh@27 -- # local nvmf_cfg=/tmp/fuzz_json_24.conf 00:08:25.596 04:52:00 -- nvmf/run.sh@29 -- # printf %02d 24 00:08:25.596 04:52:00 -- nvmf/run.sh@29 -- # port=4424 00:08:25.596 04:52:00 -- nvmf/run.sh@30 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:25.596 04:52:00 -- nvmf/run.sh@32 -- # trid='trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' 00:08:25.596 04:52:00 -- nvmf/run.sh@33 -- # sed -e 's/"trsvcid": "4420"/"trsvcid": "4424"/' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/nvmf/fuzz_json.conf 00:08:25.596 04:52:00 -- nvmf/run.sh@36 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz -m 0x1 -s 512 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F 'trtype:tcp adrfam:IPv4 subnqn:nqn.2016-06.io.spdk:cnode1 traddr:127.0.0.1 trsvcid:4424' -c /tmp/fuzz_json_24.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 -Z 24 -r /var/tmp/spdk24.sock 00:08:25.596 [2024-11-08 04:52:00.527166] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 
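Run 23 above ends with libFuzzer's standard epilogue: a final "#61 DONE cov: 11825 ft: 15138 corp: 44/564b ..." status line, the "Recommended dictionary" of byte strings that repeatedly produced new coverage, and the "Done 61 runs in 2 second(s)" summary, after which the run script deletes /tmp/fuzz_json_23.conf, rewrites the config for port 4424, and launches fuzzer 24. Because that epilogue is plain libFuzzer output, a captured log like this one can be mined for per-run results directly. A minimal sketch, assuming the console output was saved to a file (fuzz.log is a hypothetical name) with the Jenkins elapsed-time prefixes still attached:

#!/usr/bin/env bash
# Minimal sketch: mine a saved fuzzer console log for results.
# "fuzz.log" is a hypothetical capture of the output above; the
# patterns tolerate the leading 00:08:xx.xxx elapsed-time stamps.
LOG=fuzz.log

# Final status line of each run ("#61 DONE cov: 11825 ft: 15138 ...").
grep -E '#[0-9]+ +DONE +cov:' "$LOG"

# Coverage growth over time: event number, cov, corpus count/bytes.
grep -oE '#[0-9]+ +NEW +cov: +[0-9]+ +ft: +[0-9]+ +corp: +[0-9]+/[0-9]+b' "$LOG" |
  awk '{sub(/^#/, "", $1); print $1, $4, $8}'

# Collect the recommended dictionary entries for reuse on later runs.
# NB: libFuzzer prints non-printable bytes as octal escapes here, while
# its -dict= parser may expect \xNN hex escapes instead, so a conversion
# step may still be needed (left out of this sketch).
sed -n '/Recommended dictionary/,/End of recommended dictionary/p' "$LOG" |
  grep -F '"' |
  sed -E 's/^[0-9:.]+ +//; s/ +# Uses: [0-9]+$//' > nvmf_24.dict

The "Uses" counts in the dictionary block mirror the per-entry tallies printed above (e.g. "\377\202\226~\242\232\203\022" # Uses: 2), so sorting nvmf_24.dict by that count before conversion is a cheap way to keep only the most productive entries.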
00:08:25.596 [2024-11-08 04:52:00.527236] [ DPDK EAL parameters: nvme_fuzz --no-shconf -c 0x1 -m 512 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3689592 ] 00:08:25.596 EAL: No free 2048 kB hugepages reported on node 1 00:08:25.596 [2024-11-08 04:52:00.702921] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:25.855 [2024-11-08 04:52:00.767781] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:25.855 [2024-11-08 04:52:00.767907] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:25.855 [2024-11-08 04:52:00.826236] tcp.c: 661:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:08:25.855 [2024-11-08 04:52:00.842546] tcp.c: 954:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4424 *** 00:08:25.855 INFO: Running with entropic power schedule (0xFF, 100). 00:08:25.855 INFO: Seed: 1303212924 00:08:25.855 INFO: Loaded 1 modules (344649 inline 8-bit counters): 344649 [0x2819f8c, 0x286e1d5), 00:08:25.855 INFO: Loaded 1 PC tables (344649 PCs): 344649 [0x286e1d8,0x2db0668), 00:08:25.855 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_nvmf_24 00:08:25.855 INFO: A corpus is not provided, starting from an empty corpus 00:08:25.855 #2 INITED exec/s: 0 rss: 60Mb 00:08:25.855 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:25.855 This may also happen if the target rejected all inputs we tried so far 00:08:25.855 [2024-11-08 04:52:00.897819] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.855 [2024-11-08 04:52:00.897849] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:25.855 [2024-11-08 04:52:00.897905] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:25.855 [2024-11-08 04:52:00.897921] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.113 NEW_FUNC[1/672]: 0x466208 in fuzz_nvm_compare_command /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:685 00:08:26.113 NEW_FUNC[2/672]: 0x476e88 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_nvme_fuzz/llvm_nvme_fuzz.c:780 00:08:26.113 #6 NEW cov: 11670 ft: 11661 corp: 2/57b lim: 100 exec/s: 0 rss: 68Mb L: 56/56 MS: 4 ChangeBit-ShuffleBytes-InsertByte-InsertRepeatedBytes- 00:08:26.113 [2024-11-08 04:52:01.218727] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.113 [2024-11-08 04:52:01.218763] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.113 [2024-11-08 04:52:01.218814] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.113 [2024-11-08 04:52:01.218830] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID 
NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.113 [2024-11-08 04:52:01.218881] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.113 [2024-11-08 04:52:01.218897] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.372 #15 NEW cov: 11783 ft: 12524 corp: 3/125b lim: 100 exec/s: 0 rss: 68Mb L: 68/68 MS: 4 ShuffleBytes-ChangeByte-CopyPart-InsertRepeatedBytes- 00:08:26.372 [2024-11-08 04:52:01.258720] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.372 [2024-11-08 04:52:01.258748] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.372 [2024-11-08 04:52:01.258785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.372 [2024-11-08 04:52:01.258801] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.372 [2024-11-08 04:52:01.258834] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.372 [2024-11-08 04:52:01.258850] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.372 #16 NEW cov: 11789 ft: 12684 corp: 4/193b lim: 100 exec/s: 0 rss: 68Mb L: 68/68 MS: 1 ShuffleBytes- 00:08:26.372 [2024-11-08 04:52:01.298867] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967295 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.372 [2024-11-08 04:52:01.298894] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.298931] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.298951] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.298995] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.299010] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.373 #22 NEW cov: 11874 ft: 12984 corp: 5/261b lim: 100 exec/s: 0 rss: 68Mb L: 68/68 MS: 1 ChangeBinInt- 00:08:26.373 [2024-11-08 04:52:01.338973] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073541779455 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.339000] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.339038] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE 
sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.339055] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.339099] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.339114] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.373 #27 NEW cov: 11874 ft: 13203 corp: 6/335b lim: 100 exec/s: 0 rss: 68Mb L: 74/74 MS: 5 InsertByte-ChangeBinInt-EraseBytes-CopyPart-InsertRepeatedBytes- 00:08:26.373 [2024-11-08 04:52:01.379063] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.379089] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.379127] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.379143] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.379194] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.379210] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.373 #33 NEW cov: 11874 ft: 13308 corp: 7/403b lim: 100 exec/s: 0 rss: 68Mb L: 68/74 MS: 1 ChangeBit- 00:08:26.373 [2024-11-08 04:52:01.419458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967295 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.419485] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.419527] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.419540] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.419591] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.419607] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.419661] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.419677] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.373 
[2024-11-08 04:52:01.419730] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.419745] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:26.373 #34 NEW cov: 11874 ft: 13761 corp: 8/503b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 CopyPart- 00:08:26.373 [2024-11-08 04:52:01.469335] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.469362] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.469415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.469432] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.373 [2024-11-08 04:52:01.469485] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.373 [2024-11-08 04:52:01.469501] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.632 #35 NEW cov: 11874 ft: 13772 corp: 9/571b lim: 100 exec/s: 0 rss: 68Mb L: 68/100 MS: 1 ChangeByte- 00:08:26.632 [2024-11-08 04:52:01.509707] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967295 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.509734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.509787] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.509802] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.509854] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.509870] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.509921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.509937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.509990] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:4 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.510005] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:4 
cdw0:0 sqhd:0006 p:0 m:0 dnr:1 00:08:26.632 #36 NEW cov: 11874 ft: 13805 corp: 10/671b lim: 100 exec/s: 0 rss: 68Mb L: 100/100 MS: 1 InsertRepeatedBytes- 00:08:26.632 [2024-11-08 04:52:01.549704] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.549731] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.549769] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.549785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.549836] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.549852] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.549903] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.549918] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.632 #37 NEW cov: 11874 ft: 13834 corp: 11/768b lim: 100 exec/s: 0 rss: 68Mb L: 97/100 MS: 1 CopyPart- 00:08:26.632 [2024-11-08 04:52:01.589708] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.589734] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.589770] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.589785] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.589839] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.589854] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.632 #38 NEW cov: 11874 ft: 13847 corp: 12/836b lim: 100 exec/s: 0 rss: 68Mb L: 68/100 MS: 1 CrossOver- 00:08:26.632 [2024-11-08 04:52:01.629503] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967295 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.629533] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.632 #39 NEW cov: 11874 ft: 14750 corp: 13/875b lim: 100 exec/s: 0 rss: 68Mb L: 39/100 MS: 1 EraseBytes- 00:08:26.632 [2024-11-08 04:52:01.669948] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.669975] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.632 [2024-11-08 04:52:01.670012] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.632 [2024-11-08 04:52:01.670027] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.633 [2024-11-08 04:52:01.670080] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.633 [2024-11-08 04:52:01.670095] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.633 #40 NEW cov: 11874 ft: 14842 corp: 14/943b lim: 100 exec/s: 0 rss: 68Mb L: 68/100 MS: 1 ShuffleBytes- 00:08:26.633 [2024-11-08 04:52:01.710041] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.633 [2024-11-08 04:52:01.710070] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.633 [2024-11-08 04:52:01.710108] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.633 [2024-11-08 04:52:01.710123] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.633 [2024-11-08 04:52:01.710176] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.633 [2024-11-08 04:52:01.710191] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.633 #41 NEW cov: 11874 ft: 14883 corp: 15/1006b lim: 100 exec/s: 0 rss: 69Mb L: 63/100 MS: 1 EraseBytes- 00:08:26.892 [2024-11-08 04:52:01.750013] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16565742349756786149 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.750040] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.750079] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.750094] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.892 NEW_FUNC[1/1]: 0x194e708 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:26.892 #42 NEW cov: 11897 ft: 14912 corp: 16/1062b lim: 100 exec/s: 0 rss: 69Mb L: 56/100 MS: 1 ChangeByte- 00:08:26.892 [2024-11-08 04:52:01.790406] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.790433] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.790475] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.790492] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.790546] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446528831423512575 len:15421 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.790562] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.790615] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4340410370284600380 len:15421 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.790630] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.892 #43 NEW cov: 11897 ft: 14959 corp: 17/1158b lim: 100 exec/s: 0 rss: 69Mb L: 96/100 MS: 1 InsertRepeatedBytes- 00:08:26.892 [2024-11-08 04:52:01.830513] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967041 len:2816 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.830545] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.830592] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.830608] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.830667] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446528831423512575 len:15421 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.830682] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.830735] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4340410370284600380 len:15421 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.830751] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.892 #44 NEW cov: 11897 ft: 14973 corp: 18/1254b lim: 100 exec/s: 0 rss: 69Mb L: 96/100 MS: 1 ChangeBinInt- 00:08:26.892 [2024-11-08 04:52:01.870499] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.870530] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.870584] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.870602] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.870655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.870681] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.892 #45 NEW cov: 11897 ft: 14996 corp: 19/1322b lim: 100 exec/s: 45 rss: 69Mb L: 68/100 MS: 1 CopyPart- 00:08:26.892 [2024-11-08 04:52:01.910713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.910741] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.910782] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.892 [2024-11-08 04:52:01.910797] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.892 [2024-11-08 04:52:01.910849] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446528831423512575 len:15421 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-08 04:52:01.910864] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:26.893 [2024-11-08 04:52:01.910917] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:4340410370284600380 len:15421 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-08 04:52:01.910933] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:26.893 #46 NEW cov: 11897 ft: 15009 corp: 20/1418b lim: 100 exec/s: 46 rss: 69Mb L: 96/100 MS: 1 ChangeBinInt- 00:08:26.893 [2024-11-08 04:52:01.950694] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-08 04:52:01.950721] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.893 [2024-11-08 04:52:01.950758] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-08 04:52:01.950773] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:26.893 [2024-11-08 04:52:01.950829] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-08 04:52:01.950844] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 
00:08:26.893 #47 NEW cov: 11897 ft: 15031 corp: 21/1486b lim: 100 exec/s: 47 rss: 69Mb L: 68/100 MS: 1 ChangeByte- 00:08:26.893 [2024-11-08 04:52:01.990686] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-08 04:52:01.990723] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:26.893 [2024-11-08 04:52:01.990778] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:26.893 [2024-11-08 04:52:01.990793] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.152 #48 NEW cov: 11897 ft: 15043 corp: 22/1539b lim: 100 exec/s: 48 rss: 69Mb L: 53/100 MS: 1 EraseBytes- 00:08:27.152 [2024-11-08 04:52:02.030777] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.030805] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.152 [2024-11-08 04:52:02.030859] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.030875] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.152 #49 NEW cov: 11897 ft: 15106 corp: 23/1585b lim: 100 exec/s: 49 rss: 69Mb L: 46/100 MS: 1 EraseBytes- 00:08:27.152 [2024-11-08 04:52:02.071159] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.071187] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.152 [2024-11-08 04:52:02.071228] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.071243] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.152 [2024-11-08 04:52:02.071296] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.071312] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.152 [2024-11-08 04:52:02.071365] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.071382] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.152 #50 NEW cov: 11897 ft: 15149 corp: 24/1679b lim: 100 exec/s: 50 rss: 69Mb L: 94/100 MS: 1 InsertRepeatedBytes- 00:08:27.152 [2024-11-08 04:52:02.111136] nvme_qpair.c: 
247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.111163] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.152 [2024-11-08 04:52:02.111211] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.111229] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.152 [2024-11-08 04:52:02.111280] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.111295] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.152 #51 NEW cov: 11897 ft: 15173 corp: 25/1747b lim: 100 exec/s: 51 rss: 69Mb L: 68/100 MS: 1 ChangeByte- 00:08:27.152 [2024-11-08 04:52:02.151281] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.151308] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.152 [2024-11-08 04:52:02.151345] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.151361] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.152 [2024-11-08 04:52:02.151415] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:4864 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.151430] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.152 #52 NEW cov: 11897 ft: 15179 corp: 26/1815b lim: 100 exec/s: 52 rss: 69Mb L: 68/100 MS: 1 ChangeByte- 00:08:27.152 [2024-11-08 04:52:02.191111] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967295 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.191137] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.152 #53 NEW cov: 11897 ft: 15212 corp: 27/1854b lim: 100 exec/s: 53 rss: 69Mb L: 39/100 MS: 1 ChangeBit- 00:08:27.152 [2024-11-08 04:52:02.231259] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967295 len:6 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.152 [2024-11-08 04:52:02.231286] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.152 #54 NEW cov: 11897 ft: 15251 corp: 28/1893b lim: 100 exec/s: 54 rss: 69Mb L: 39/100 MS: 1 ChangeBinInt- 00:08:27.412 [2024-11-08 04:52:02.271681] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967295 len:65536 SGL DATA BLOCK 
OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.271709] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.271757] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.271772] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.271827] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.271842] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.412 #55 NEW cov: 11897 ft: 15264 corp: 29/1954b lim: 100 exec/s: 55 rss: 69Mb L: 61/100 MS: 1 EraseBytes- 00:08:27.412 [2024-11-08 04:52:02.311933] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.311962] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.312000] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.312016] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.312070] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.312086] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.312139] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.312155] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.412 #56 NEW cov: 11897 ft: 15293 corp: 30/2051b lim: 100 exec/s: 56 rss: 69Mb L: 97/100 MS: 1 ChangeBit- 00:08:27.412 [2024-11-08 04:52:02.351731] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16565742349756786149 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.351757] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.351809] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.351825] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.412 #57 NEW cov: 11897 ft: 15298 corp: 31/2107b lim: 100 exec/s: 57 rss: 70Mb L: 56/100 MS: 1 ChangeBinInt- 
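The MS suffix on each NEW line names the mutator sequence that produced the input; this run exercises CopyPart, ChangeByte, ChangeBit, ChangeBinInt, EraseBytes, InsertByte, InsertRepeatedBytes, CrossOver, and CMP, whose DE: entry reappears in the recommended dictionary printed at the end of the run. A rough tally of which mutators are paying off, again assuming the console output was saved to the placeholder file console.log (multi-step sequences are counted as one combined token, which is good enough for a sketch):

grep -o 'MS: [0-9]* [A-Za-z-]*' console.log \
  | awk '{print $3}' \
  | sort | uniq -c | sort -rn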
00:08:27.412 [2024-11-08 04:52:02.392004] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.392031] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.392068] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.392083] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.392136] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.392153] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.412 #58 NEW cov: 11897 ft: 15308 corp: 32/2175b lim: 100 exec/s: 58 rss: 70Mb L: 68/100 MS: 1 ChangeBit- 00:08:27.412 [2024-11-08 04:52:02.432250] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18427811295228067839 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.432282] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.432326] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493558414524 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.432341] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.432394] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952781321223356 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.432413] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.432468] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.432484] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.412 #59 NEW cov: 11897 ft: 15351 corp: 33/2263b lim: 100 exec/s: 59 rss: 70Mb L: 88/100 MS: 1 InsertRepeatedBytes- 00:08:27.412 [2024-11-08 04:52:02.471916] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:4294967295 len:518 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.471943] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.412 #60 NEW cov: 11897 ft: 15388 corp: 34/2302b lim: 100 exec/s: 60 rss: 70Mb L: 39/100 MS: 1 ChangeBit- 00:08:27.412 [2024-11-08 04:52:02.512488] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18427811295228067839 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 
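Every rejected COMPARE above completes with status (00/0b): status code type 0x0 (generic command status) and status code 0x0b, Invalid Namespace or Format, which is expected since the fuzzer issues commands with nsid:0 and LBAs such as 18446744073709551615, i.e. 0xFFFFFFFFFFFFFFFF, 2^64-1. A small sketch that splits such a pair the way the two hex fields in parentheses are defined; decode_status is a hypothetical helper, not SPDK API:

decode_status() {
    # $1/$2 are the two hex bytes the log prints as "(sct/sc)":
    # status code type and status code.
    local sct=$((16#$1)) sc=$((16#$2))
    printf 'sct=0x%x sc=0x%x' "$sct" "$sc"
    # 0x0b under the generic status code type is Invalid Namespace or Format.
    [ "$sct" -eq 0 ] && [ "$sc" -eq 11 ] && printf ' (Invalid Namespace or Format)'
    printf '\n'
}
decode_status 00 0b   # -> sct=0x0 sc=0xb (Invalid Namespace or Format)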
00:08:27.412 [2024-11-08 04:52:02.512515] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.512568] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493562608828 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.512601] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.512655] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:13599952781321223356 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.512670] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.412 [2024-11-08 04:52:02.512724] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.412 [2024-11-08 04:52:02.512739] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.672 #61 NEW cov: 11897 ft: 15398 corp: 35/2390b lim: 100 exec/s: 61 rss: 70Mb L: 88/100 MS: 1 ChangeBit- 00:08:27.672 [2024-11-08 04:52:02.552469] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.552496] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.552538] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446499982128185343 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.552553] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.552607] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.552622] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.672 #62 NEW cov: 11897 ft: 15400 corp: 36/2459b lim: 100 exec/s: 62 rss: 70Mb L: 69/100 MS: 1 InsertByte- 00:08:27.672 [2024-11-08 04:52:02.592715] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18427811295228067839 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.592743] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.592784] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:13599952493562608828 len:48317 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.592798] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.592851] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 
lba:13599952781321223356 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.592867] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.592921] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:18446463702539436031 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.592937] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.672 #63 NEW cov: 11897 ft: 15408 corp: 37/2547b lim: 100 exec/s: 63 rss: 70Mb L: 88/100 MS: 1 ChangeBinInt- 00:08:27.672 [2024-11-08 04:52:02.632713] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.632740] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.632785] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.632799] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.632852] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.632868] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.672 #64 NEW cov: 11897 ft: 15425 corp: 38/2615b lim: 100 exec/s: 64 rss: 70Mb L: 68/100 MS: 1 ChangeByte- 00:08:27.672 [2024-11-08 04:52:02.672828] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.672855] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.672897] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.672912] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.672968] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.672983] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.672 #65 NEW cov: 11897 ft: 15431 corp: 39/2684b lim: 100 exec/s: 65 rss: 70Mb L: 69/100 MS: 1 CopyPart- 00:08:27.672 [2024-11-08 04:52:02.712944] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.712972] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) 
qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.713010] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.713026] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.713082] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.713099] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.672 #66 NEW cov: 11897 ft: 15450 corp: 40/2752b lim: 100 exec/s: 66 rss: 70Mb L: 68/100 MS: 1 ChangeBit- 00:08:27.672 [2024-11-08 04:52:02.753021] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.753048] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.753087] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.753102] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.672 [2024-11-08 04:52:02.753157] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.672 [2024-11-08 04:52:02.753172] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.932 #67 NEW cov: 11897 ft: 15498 corp: 41/2822b lim: 100 exec/s: 67 rss: 70Mb L: 70/100 MS: 1 CopyPart- 00:08:27.932 [2024-11-08 04:52:02.793186] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.793212] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.932 [2024-11-08 04:52:02.793247] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16565899579919558117 len:1 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.793263] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.932 [2024-11-08 04:52:02.793317] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.793333] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.932 #68 NEW cov: 11897 ft: 15510 corp: 42/2886b lim: 100 exec/s: 68 rss: 70Mb L: 64/100 MS: 1 CMP- DE: "\001\000\000\000,\342\234\323"- 00:08:27.932 [2024-11-08 04:52:02.833393] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 
lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.833420] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.932 [2024-11-08 04:52:02.833458] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.833474] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.932 [2024-11-08 04:52:02.833530] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:16565899579986666981 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.833546] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.932 [2024-11-08 04:52:02.833600] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:3 nsid:0 lba:16565899579919558117 len:58854 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.833618] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:3 cdw0:0 sqhd:0005 p:0 m:0 dnr:1 00:08:27.932 #69 NEW cov: 11897 ft: 15513 corp: 43/2985b lim: 100 exec/s: 69 rss: 70Mb L: 99/100 MS: 1 CrossOver- 00:08:27.932 [2024-11-08 04:52:02.873392] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:0 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.873419] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:0 cdw0:0 sqhd:0002 p:0 m:0 dnr:1 00:08:27.932 [2024-11-08 04:52:02.873473] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:1 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.873490] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:1 cdw0:0 sqhd:0003 p:0 m:0 dnr:1 00:08:27.932 [2024-11-08 04:52:02.873548] nvme_qpair.c: 247:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:2 nsid:0 lba:18446744073709551615 len:65536 SGL DATA BLOCK OFFSET 0x0 len:0x1000 00:08:27.932 [2024-11-08 04:52:02.873566] nvme_qpair.c: 477:spdk_nvme_print_completion: *NOTICE*: INVALID NAMESPACE OR FORMAT (00/0b) qid:1 cid:2 cdw0:0 sqhd:0004 p:0 m:0 dnr:1 00:08:27.932 #70 NEW cov: 11897 ft: 15517 corp: 44/3053b lim: 100 exec/s: 35 rss: 70Mb L: 68/100 MS: 1 CopyPart- 00:08:27.932 #70 DONE cov: 11897 ft: 15517 corp: 44/3053b lim: 100 exec/s: 35 rss: 70Mb 00:08:27.932 ###### Recommended dictionary. ###### 00:08:27.932 "\001\000\000\000,\342\234\323" # Uses: 0 00:08:27.932 ###### End of recommended dictionary. 
###### 00:08:27.932 Done 70 runs in 2 second(s) 00:08:27.932 04:52:03 -- nvmf/run.sh@46 -- # rm -rf /tmp/fuzz_json_24.conf 00:08:27.932 04:52:03 -- ../common.sh@72 -- # (( i++ )) 00:08:27.932 04:52:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:27.932 04:52:03 -- nvmf/run.sh@71 -- # trap - SIGINT SIGTERM EXIT 00:08:27.932 00:08:27.932 real 1m4.296s 00:08:27.932 user 1m40.525s 00:08:27.932 sys 0m7.381s 00:08:27.932 04:52:03 -- common/autotest_common.sh@1115 -- # xtrace_disable 00:08:27.932 04:52:03 -- common/autotest_common.sh@10 -- # set +x 00:08:27.932 ************************************ 00:08:27.932 END TEST nvmf_fuzz 00:08:27.932 ************************************ 00:08:28.192 04:52:03 -- fuzz/llvm.sh@17 -- # for fuzzer in "${fuzzers[@]}" 00:08:28.192 04:52:03 -- fuzz/llvm.sh@18 -- # case "$fuzzer" in 00:08:28.192 04:52:03 -- fuzz/llvm.sh@20 -- # run_test vfio_fuzz /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:28.192 04:52:03 -- common/autotest_common.sh@1087 -- # '[' 2 -le 1 ']' 00:08:28.192 04:52:03 -- common/autotest_common.sh@1093 -- # xtrace_disable 00:08:28.192 04:52:03 -- common/autotest_common.sh@10 -- # set +x 00:08:28.192 ************************************ 00:08:28.192 START TEST vfio_fuzz 00:08:28.192 ************************************ 00:08:28.192 04:52:03 -- common/autotest_common.sh@1114 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/run.sh 00:08:28.192 * Looking for test storage... 00:08:28.192 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:28.192 04:52:03 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:28.192 04:52:03 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:28.192 04:52:03 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:28.192 04:52:03 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:28.192 04:52:03 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:28.192 04:52:03 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:28.192 04:52:03 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:28.192 04:52:03 -- scripts/common.sh@335 -- # IFS=.-: 00:08:28.193 04:52:03 -- scripts/common.sh@335 -- # read -ra ver1 00:08:28.193 04:52:03 -- scripts/common.sh@336 -- # IFS=.-: 00:08:28.193 04:52:03 -- scripts/common.sh@336 -- # read -ra ver2 00:08:28.193 04:52:03 -- scripts/common.sh@337 -- # local 'op=<' 00:08:28.193 04:52:03 -- scripts/common.sh@339 -- # ver1_l=2 00:08:28.193 04:52:03 -- scripts/common.sh@340 -- # ver2_l=1 00:08:28.193 04:52:03 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:28.193 04:52:03 -- scripts/common.sh@343 -- # case "$op" in 00:08:28.193 04:52:03 -- scripts/common.sh@344 -- # : 1 00:08:28.193 04:52:03 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:28.193 04:52:03 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:28.193 04:52:03 -- scripts/common.sh@364 -- # decimal 1 00:08:28.193 04:52:03 -- scripts/common.sh@352 -- # local d=1 00:08:28.193 04:52:03 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:28.193 04:52:03 -- scripts/common.sh@354 -- # echo 1 00:08:28.193 04:52:03 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:28.193 04:52:03 -- scripts/common.sh@365 -- # decimal 2 00:08:28.193 04:52:03 -- scripts/common.sh@352 -- # local d=2 00:08:28.193 04:52:03 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:28.193 04:52:03 -- scripts/common.sh@354 -- # echo 2 00:08:28.193 04:52:03 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:28.193 04:52:03 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:28.193 04:52:03 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:28.193 04:52:03 -- scripts/common.sh@367 -- # return 0 00:08:28.193 04:52:03 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:28.193 04:52:03 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:28.193 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:28.193 --rc genhtml_branch_coverage=1 00:08:28.193 --rc genhtml_function_coverage=1 00:08:28.193 --rc genhtml_legend=1 00:08:28.193 --rc geninfo_all_blocks=1 00:08:28.193 --rc geninfo_unexecuted_blocks=1 00:08:28.193 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:28.193 ' 00:08:28.193 04:52:03 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:28.193 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:28.193 --rc genhtml_branch_coverage=1 00:08:28.193 --rc genhtml_function_coverage=1 00:08:28.193 --rc genhtml_legend=1 00:08:28.193 --rc geninfo_all_blocks=1 00:08:28.193 --rc geninfo_unexecuted_blocks=1 00:08:28.193 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:28.193 ' 00:08:28.193 04:52:03 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:28.193 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:28.193 --rc genhtml_branch_coverage=1 00:08:28.193 --rc genhtml_function_coverage=1 00:08:28.193 --rc genhtml_legend=1 00:08:28.193 --rc geninfo_all_blocks=1 00:08:28.193 --rc geninfo_unexecuted_blocks=1 00:08:28.193 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:28.193 ' 00:08:28.193 04:52:03 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:28.193 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:28.193 --rc genhtml_branch_coverage=1 00:08:28.193 --rc genhtml_function_coverage=1 00:08:28.193 --rc genhtml_legend=1 00:08:28.193 --rc geninfo_all_blocks=1 00:08:28.193 --rc geninfo_unexecuted_blocks=1 00:08:28.193 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:28.193 ' 00:08:28.193 04:52:03 -- vfio/run.sh@55 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/setup/common.sh 00:08:28.193 04:52:03 -- setup/common.sh@6 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/autotest_common.sh 00:08:28.193 04:52:03 -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:08:28.193 04:52:03 -- common/autotest_common.sh@34 -- # set -e 00:08:28.193 04:52:03 -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:08:28.193 04:52:03 -- common/autotest_common.sh@36 -- # shopt -s extglob 00:08:28.193 04:52:03 -- common/autotest_common.sh@38 -- # [[ -e 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh ]] 00:08:28.193 04:52:03 -- common/autotest_common.sh@39 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/build_config.sh 00:08:28.193 04:52:03 -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:08:28.193 04:52:03 -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:08:28.193 04:52:03 -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:08:28.193 04:52:03 -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:08:28.193 04:52:03 -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:08:28.193 04:52:03 -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:08:28.193 04:52:03 -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:08:28.193 04:52:03 -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:08:28.193 04:52:03 -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:08:28.193 04:52:03 -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:08:28.193 04:52:03 -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:08:28.193 04:52:03 -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:08:28.193 04:52:03 -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:08:28.193 04:52:03 -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:08:28.193 04:52:03 -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:08:28.193 04:52:03 -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:08:28.193 04:52:03 -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:08:28.193 04:52:03 -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:08:28.193 04:52:03 -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:28.193 04:52:03 -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:08:28.193 04:52:03 -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:08:28.193 04:52:03 -- common/build_config.sh@22 -- # CONFIG_CET=n 00:08:28.193 04:52:03 -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:08:28.193 04:52:03 -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:08:28.193 04:52:03 -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:08:28.193 04:52:03 -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:08:28.193 04:52:03 -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:08:28.193 04:52:03 -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:08:28.193 04:52:03 -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:08:28.193 04:52:03 -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:08:28.193 04:52:03 -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:08:28.193 04:52:03 -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:08:28.193 04:52:03 -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:08:28.193 04:52:03 -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB=/usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:28.193 04:52:03 -- common/build_config.sh@35 -- # CONFIG_FUZZER=y 00:08:28.193 04:52:03 -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:28.193 04:52:03 -- common/build_config.sh@37 -- # CONFIG_CRYPTO=n 00:08:28.193 04:52:03 -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:08:28.193 04:52:03 -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:08:28.193 04:52:03 -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:08:28.193 04:52:03 -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:08:28.193 04:52:03 -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 
00:08:28.193 04:52:03 -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:08:28.193 04:52:03 -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:08:28.193 04:52:03 -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:08:28.193 04:52:03 -- common/build_config.sh@46 -- # CONFIG_COVERAGE=y 00:08:28.193 04:52:03 -- common/build_config.sh@47 -- # CONFIG_RDMA=y 00:08:28.193 04:52:03 -- common/build_config.sh@48 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:08:28.193 04:52:03 -- common/build_config.sh@49 -- # CONFIG_URING_PATH= 00:08:28.193 04:52:03 -- common/build_config.sh@50 -- # CONFIG_XNVME=n 00:08:28.193 04:52:03 -- common/build_config.sh@51 -- # CONFIG_VFIO_USER=y 00:08:28.193 04:52:03 -- common/build_config.sh@52 -- # CONFIG_ARCH=native 00:08:28.193 04:52:03 -- common/build_config.sh@53 -- # CONFIG_URING_ZNS=n 00:08:28.193 04:52:03 -- common/build_config.sh@54 -- # CONFIG_WERROR=y 00:08:28.193 04:52:03 -- common/build_config.sh@55 -- # CONFIG_HAVE_LIBBSD=n 00:08:28.193 04:52:03 -- common/build_config.sh@56 -- # CONFIG_UBSAN=y 00:08:28.193 04:52:03 -- common/build_config.sh@57 -- # CONFIG_IPSEC_MB_DIR= 00:08:28.193 04:52:03 -- common/build_config.sh@58 -- # CONFIG_GOLANG=n 00:08:28.193 04:52:03 -- common/build_config.sh@59 -- # CONFIG_ISAL=y 00:08:28.193 04:52:03 -- common/build_config.sh@60 -- # CONFIG_IDXD_KERNEL=y 00:08:28.193 04:52:03 -- common/build_config.sh@61 -- # CONFIG_DPDK_LIB_DIR= 00:08:28.193 04:52:03 -- common/build_config.sh@62 -- # CONFIG_RDMA_PROV=verbs 00:08:28.193 04:52:03 -- common/build_config.sh@63 -- # CONFIG_APPS=y 00:08:28.193 04:52:03 -- common/build_config.sh@64 -- # CONFIG_SHARED=n 00:08:28.193 04:52:03 -- common/build_config.sh@65 -- # CONFIG_FC_PATH= 00:08:28.193 04:52:03 -- common/build_config.sh@66 -- # CONFIG_DPDK_PKG_CONFIG=n 00:08:28.193 04:52:03 -- common/build_config.sh@67 -- # CONFIG_FC=n 00:08:28.193 04:52:03 -- common/build_config.sh@68 -- # CONFIG_AVAHI=n 00:08:28.193 04:52:03 -- common/build_config.sh@69 -- # CONFIG_FIO_PLUGIN=y 00:08:28.193 04:52:03 -- common/build_config.sh@70 -- # CONFIG_RAID5F=n 00:08:28.193 04:52:03 -- common/build_config.sh@71 -- # CONFIG_EXAMPLES=y 00:08:28.193 04:52:03 -- common/build_config.sh@72 -- # CONFIG_TESTS=y 00:08:28.193 04:52:03 -- common/build_config.sh@73 -- # CONFIG_CRYPTO_MLX5=n 00:08:28.193 04:52:03 -- common/build_config.sh@74 -- # CONFIG_MAX_LCORES= 00:08:28.194 04:52:03 -- common/build_config.sh@75 -- # CONFIG_IPSEC_MB=n 00:08:28.194 04:52:03 -- common/build_config.sh@76 -- # CONFIG_DEBUG=y 00:08:28.194 04:52:03 -- common/build_config.sh@77 -- # CONFIG_DPDK_COMPRESSDEV=n 00:08:28.194 04:52:03 -- common/build_config.sh@78 -- # CONFIG_CROSS_PREFIX= 00:08:28.194 04:52:03 -- common/build_config.sh@79 -- # CONFIG_URING=n 00:08:28.194 04:52:03 -- common/autotest_common.sh@48 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:28.194 04:52:03 -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common/applications.sh 00:08:28.194 04:52:03 -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:28.194 04:52:03 -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/common 00:08:28.194 04:52:03 -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:28.194 04:52:03 -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 
00:08:28.194 04:52:03 -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app 00:08:28.194 04:52:03 -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:28.194 04:52:03 -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:08:28.194 04:52:03 -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:08:28.194 04:52:03 -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:08:28.194 04:52:03 -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:08:28.194 04:52:03 -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:08:28.194 04:52:03 -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:08:28.194 04:52:03 -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/include/spdk/config.h ]] 00:08:28.194 04:52:03 -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:08:28.194 #define SPDK_CONFIG_H 00:08:28.194 #define SPDK_CONFIG_APPS 1 00:08:28.194 #define SPDK_CONFIG_ARCH native 00:08:28.194 #undef SPDK_CONFIG_ASAN 00:08:28.194 #undef SPDK_CONFIG_AVAHI 00:08:28.194 #undef SPDK_CONFIG_CET 00:08:28.194 #define SPDK_CONFIG_COVERAGE 1 00:08:28.194 #define SPDK_CONFIG_CROSS_PREFIX 00:08:28.194 #undef SPDK_CONFIG_CRYPTO 00:08:28.194 #undef SPDK_CONFIG_CRYPTO_MLX5 00:08:28.194 #undef SPDK_CONFIG_CUSTOMOCF 00:08:28.194 #undef SPDK_CONFIG_DAOS 00:08:28.194 #define SPDK_CONFIG_DAOS_DIR 00:08:28.194 #define SPDK_CONFIG_DEBUG 1 00:08:28.194 #undef SPDK_CONFIG_DPDK_COMPRESSDEV 00:08:28.194 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build 00:08:28.194 #define SPDK_CONFIG_DPDK_INC_DIR 00:08:28.194 #define SPDK_CONFIG_DPDK_LIB_DIR 00:08:28.194 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:08:28.194 #define SPDK_CONFIG_ENV /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/env_dpdk 00:08:28.194 #define SPDK_CONFIG_EXAMPLES 1 00:08:28.194 #undef SPDK_CONFIG_FC 00:08:28.194 #define SPDK_CONFIG_FC_PATH 00:08:28.194 #define SPDK_CONFIG_FIO_PLUGIN 1 00:08:28.194 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:08:28.194 #undef SPDK_CONFIG_FUSE 00:08:28.194 #define SPDK_CONFIG_FUZZER 1 00:08:28.194 #define SPDK_CONFIG_FUZZER_LIB /usr/lib/clang/17/lib/x86_64-redhat-linux-gnu/libclang_rt.fuzzer_no_main.a 00:08:28.194 #undef SPDK_CONFIG_GOLANG 00:08:28.194 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:08:28.194 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:08:28.194 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:08:28.194 #undef SPDK_CONFIG_HAVE_LIBBSD 00:08:28.194 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:08:28.194 #define SPDK_CONFIG_IDXD 1 00:08:28.194 #define SPDK_CONFIG_IDXD_KERNEL 1 00:08:28.194 #undef SPDK_CONFIG_IPSEC_MB 00:08:28.194 #define SPDK_CONFIG_IPSEC_MB_DIR 00:08:28.194 #define SPDK_CONFIG_ISAL 1 00:08:28.194 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:08:28.194 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:08:28.194 #define SPDK_CONFIG_LIBDIR 00:08:28.194 #undef SPDK_CONFIG_LTO 00:08:28.194 #define SPDK_CONFIG_MAX_LCORES 00:08:28.194 #define SPDK_CONFIG_NVME_CUSE 1 00:08:28.194 #undef SPDK_CONFIG_OCF 00:08:28.194 #define SPDK_CONFIG_OCF_PATH 00:08:28.194 #define SPDK_CONFIG_OPENSSL_PATH 00:08:28.194 #undef SPDK_CONFIG_PGO_CAPTURE 00:08:28.194 #undef SPDK_CONFIG_PGO_USE 00:08:28.194 #define SPDK_CONFIG_PREFIX /usr/local 00:08:28.194 #undef SPDK_CONFIG_RAID5F 00:08:28.194 #undef SPDK_CONFIG_RBD 00:08:28.194 #define 
SPDK_CONFIG_RDMA 1 00:08:28.194 #define SPDK_CONFIG_RDMA_PROV verbs 00:08:28.194 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:08:28.194 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:08:28.194 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:08:28.194 #undef SPDK_CONFIG_SHARED 00:08:28.194 #undef SPDK_CONFIG_SMA 00:08:28.194 #define SPDK_CONFIG_TESTS 1 00:08:28.194 #undef SPDK_CONFIG_TSAN 00:08:28.194 #define SPDK_CONFIG_UBLK 1 00:08:28.194 #define SPDK_CONFIG_UBSAN 1 00:08:28.194 #undef SPDK_CONFIG_UNIT_TESTS 00:08:28.194 #undef SPDK_CONFIG_URING 00:08:28.194 #define SPDK_CONFIG_URING_PATH 00:08:28.194 #undef SPDK_CONFIG_URING_ZNS 00:08:28.194 #undef SPDK_CONFIG_USDT 00:08:28.194 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:08:28.194 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:08:28.194 #define SPDK_CONFIG_VFIO_USER 1 00:08:28.194 #define SPDK_CONFIG_VFIO_USER_DIR 00:08:28.194 #define SPDK_CONFIG_VHOST 1 00:08:28.194 #define SPDK_CONFIG_VIRTIO 1 00:08:28.194 #undef SPDK_CONFIG_VTUNE 00:08:28.194 #define SPDK_CONFIG_VTUNE_DIR 00:08:28.194 #define SPDK_CONFIG_WERROR 1 00:08:28.194 #define SPDK_CONFIG_WPDK_DIR 00:08:28.194 #undef SPDK_CONFIG_XNVME 00:08:28.194 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:08:28.194 04:52:03 -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:08:28.194 04:52:03 -- common/autotest_common.sh@49 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh 00:08:28.194 04:52:03 -- scripts/common.sh@433 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:28.194 04:52:03 -- scripts/common.sh@441 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:28.194 04:52:03 -- scripts/common.sh@442 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:28.194 04:52:03 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:28.194 04:52:03 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:28.194 04:52:03 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:28.194 04:52:03 -- paths/export.sh@5 -- # export PATH 00:08:28.194 04:52:03 -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:28.194 04:52:03 -- common/autotest_common.sh@50 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:28.194 04:52:03 -- pm/common@6 -- # dirname /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/common 00:08:28.194 04:52:03 -- pm/common@6 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:28.194 04:52:03 -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm 00:08:28.194 04:52:03 -- pm/common@7 -- # readlink -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/perf/pm/../../../ 00:08:28.194 04:52:03 -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk 00:08:28.194 04:52:03 -- pm/common@16 -- # TEST_TAG=N/A 00:08:28.194 04:52:03 -- pm/common@17 -- # TEST_TAG_FILE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/.run_test_name 00:08:28.194 04:52:03 -- common/autotest_common.sh@52 -- # : 1 00:08:28.194 04:52:03 -- common/autotest_common.sh@53 -- # export RUN_NIGHTLY 00:08:28.194 04:52:03 -- common/autotest_common.sh@56 -- # : 0 00:08:28.194 04:52:03 -- common/autotest_common.sh@57 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:08:28.194 04:52:03 -- common/autotest_common.sh@58 -- # : 0 00:08:28.194 04:52:03 -- common/autotest_common.sh@59 -- # export SPDK_RUN_VALGRIND 00:08:28.194 04:52:03 -- common/autotest_common.sh@60 -- # : 1 00:08:28.194 04:52:03 -- common/autotest_common.sh@61 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:08:28.194 04:52:03 -- common/autotest_common.sh@62 -- # : 0 00:08:28.194 04:52:03 -- common/autotest_common.sh@63 -- # export SPDK_TEST_UNITTEST 00:08:28.194 04:52:03 -- common/autotest_common.sh@64 -- # : 00:08:28.194 04:52:03 -- common/autotest_common.sh@65 -- # export SPDK_TEST_AUTOBUILD 00:08:28.194 04:52:03 -- common/autotest_common.sh@66 -- # : 0 00:08:28.194 04:52:03 -- common/autotest_common.sh@67 -- # export SPDK_TEST_RELEASE_BUILD 00:08:28.194 04:52:03 -- common/autotest_common.sh@68 -- # : 0 00:08:28.194 04:52:03 -- common/autotest_common.sh@69 -- # export SPDK_TEST_ISAL 00:08:28.194 04:52:03 -- common/autotest_common.sh@70 -- # : 0 00:08:28.194 04:52:03 -- common/autotest_common.sh@71 -- # export SPDK_TEST_ISCSI 00:08:28.194 04:52:03 -- common/autotest_common.sh@72 -- # : 0 00:08:28.194 04:52:03 -- common/autotest_common.sh@73 -- # export SPDK_TEST_ISCSI_INITIATOR 00:08:28.194 04:52:03 -- common/autotest_common.sh@74 -- # : 0 00:08:28.194 04:52:03 -- common/autotest_common.sh@75 -- # export SPDK_TEST_NVME 00:08:28.194 04:52:03 -- common/autotest_common.sh@76 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@77 -- # export SPDK_TEST_NVME_PMR 00:08:28.456 04:52:03 -- common/autotest_common.sh@78 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@79 -- # export SPDK_TEST_NVME_BP 00:08:28.456 04:52:03 -- common/autotest_common.sh@80 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME_CLI 00:08:28.456 04:52:03 -- common/autotest_common.sh@82 -- # : 0 00:08:28.456 04:52:03 
-- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_CUSE 00:08:28.456 04:52:03 -- common/autotest_common.sh@84 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_FDP 00:08:28.456 04:52:03 -- common/autotest_common.sh@86 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVMF 00:08:28.456 04:52:03 -- common/autotest_common.sh@88 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@89 -- # export SPDK_TEST_VFIOUSER 00:08:28.456 04:52:03 -- common/autotest_common.sh@90 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@91 -- # export SPDK_TEST_VFIOUSER_QEMU 00:08:28.456 04:52:03 -- common/autotest_common.sh@92 -- # : 1 00:08:28.456 04:52:03 -- common/autotest_common.sh@93 -- # export SPDK_TEST_FUZZER 00:08:28.456 04:52:03 -- common/autotest_common.sh@94 -- # : 1 00:08:28.456 04:52:03 -- common/autotest_common.sh@95 -- # export SPDK_TEST_FUZZER_SHORT 00:08:28.456 04:52:03 -- common/autotest_common.sh@96 -- # : rdma 00:08:28.456 04:52:03 -- common/autotest_common.sh@97 -- # export SPDK_TEST_NVMF_TRANSPORT 00:08:28.456 04:52:03 -- common/autotest_common.sh@98 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@99 -- # export SPDK_TEST_RBD 00:08:28.456 04:52:03 -- common/autotest_common.sh@100 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@101 -- # export SPDK_TEST_VHOST 00:08:28.456 04:52:03 -- common/autotest_common.sh@102 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@103 -- # export SPDK_TEST_BLOCKDEV 00:08:28.456 04:52:03 -- common/autotest_common.sh@104 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@105 -- # export SPDK_TEST_IOAT 00:08:28.456 04:52:03 -- common/autotest_common.sh@106 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@107 -- # export SPDK_TEST_BLOBFS 00:08:28.456 04:52:03 -- common/autotest_common.sh@108 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@109 -- # export SPDK_TEST_VHOST_INIT 00:08:28.456 04:52:03 -- common/autotest_common.sh@110 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@111 -- # export SPDK_TEST_LVOL 00:08:28.456 04:52:03 -- common/autotest_common.sh@112 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@113 -- # export SPDK_TEST_VBDEV_COMPRESS 00:08:28.456 04:52:03 -- common/autotest_common.sh@114 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@115 -- # export SPDK_RUN_ASAN 00:08:28.456 04:52:03 -- common/autotest_common.sh@116 -- # : 1 00:08:28.456 04:52:03 -- common/autotest_common.sh@117 -- # export SPDK_RUN_UBSAN 00:08:28.456 04:52:03 -- common/autotest_common.sh@118 -- # : 00:08:28.456 04:52:03 -- common/autotest_common.sh@119 -- # export SPDK_RUN_EXTERNAL_DPDK 00:08:28.456 04:52:03 -- common/autotest_common.sh@120 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@121 -- # export SPDK_RUN_NON_ROOT 00:08:28.456 04:52:03 -- common/autotest_common.sh@122 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@123 -- # export SPDK_TEST_CRYPTO 00:08:28.456 04:52:03 -- common/autotest_common.sh@124 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@125 -- # export SPDK_TEST_FTL 00:08:28.456 04:52:03 -- common/autotest_common.sh@126 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@127 -- # export SPDK_TEST_OCF 00:08:28.456 04:52:03 -- common/autotest_common.sh@128 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@129 -- # export SPDK_TEST_VMD 00:08:28.456 04:52:03 -- common/autotest_common.sh@130 -- # : 0 00:08:28.456 
04:52:03 -- common/autotest_common.sh@131 -- # export SPDK_TEST_OPAL 00:08:28.456 04:52:03 -- common/autotest_common.sh@132 -- # : 00:08:28.456 04:52:03 -- common/autotest_common.sh@133 -- # export SPDK_TEST_NATIVE_DPDK 00:08:28.456 04:52:03 -- common/autotest_common.sh@134 -- # : true 00:08:28.456 04:52:03 -- common/autotest_common.sh@135 -- # export SPDK_AUTOTEST_X 00:08:28.456 04:52:03 -- common/autotest_common.sh@136 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@137 -- # export SPDK_TEST_RAID5 00:08:28.456 04:52:03 -- common/autotest_common.sh@138 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@139 -- # export SPDK_TEST_URING 00:08:28.456 04:52:03 -- common/autotest_common.sh@140 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@141 -- # export SPDK_TEST_USDT 00:08:28.456 04:52:03 -- common/autotest_common.sh@142 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@143 -- # export SPDK_TEST_USE_IGB_UIO 00:08:28.456 04:52:03 -- common/autotest_common.sh@144 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@145 -- # export SPDK_TEST_SCHEDULER 00:08:28.456 04:52:03 -- common/autotest_common.sh@146 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@147 -- # export SPDK_TEST_SCANBUILD 00:08:28.456 04:52:03 -- common/autotest_common.sh@148 -- # : 00:08:28.456 04:52:03 -- common/autotest_common.sh@149 -- # export SPDK_TEST_NVMF_NICS 00:08:28.456 04:52:03 -- common/autotest_common.sh@150 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@151 -- # export SPDK_TEST_SMA 00:08:28.456 04:52:03 -- common/autotest_common.sh@152 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@153 -- # export SPDK_TEST_DAOS 00:08:28.456 04:52:03 -- common/autotest_common.sh@154 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@155 -- # export SPDK_TEST_XNVME 00:08:28.456 04:52:03 -- common/autotest_common.sh@156 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@157 -- # export SPDK_TEST_ACCEL_DSA 00:08:28.456 04:52:03 -- common/autotest_common.sh@158 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@159 -- # export SPDK_TEST_ACCEL_IAA 00:08:28.456 04:52:03 -- common/autotest_common.sh@160 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@161 -- # export SPDK_TEST_ACCEL_IOAT 00:08:28.456 04:52:03 -- common/autotest_common.sh@163 -- # : 00:08:28.456 04:52:03 -- common/autotest_common.sh@164 -- # export SPDK_TEST_FUZZER_TARGET 00:08:28.456 04:52:03 -- common/autotest_common.sh@165 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@166 -- # export SPDK_TEST_NVMF_MDNS 00:08:28.456 04:52:03 -- common/autotest_common.sh@167 -- # : 0 00:08:28.456 04:52:03 -- common/autotest_common.sh@168 -- # export SPDK_JSONRPC_GO_CLIENT 00:08:28.456 04:52:03 -- common/autotest_common.sh@171 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:28.456 04:52:03 -- common/autotest_common.sh@171 -- # SPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib 00:08:28.456 04:52:03 -- common/autotest_common.sh@172 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:28.456 04:52:03 -- common/autotest_common.sh@172 -- # DPDK_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib 00:08:28.456 04:52:03 -- common/autotest_common.sh@173 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:28.456 04:52:03 -- common/autotest_common.sh@173 -- # 
VFIO_LIB_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:28.456 04:52:03 -- common/autotest_common.sh@174 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:28.456 04:52:03 -- common/autotest_common.sh@174 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:08:28.456 04:52:03 -- common/autotest_common.sh@177 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:08:28.456 04:52:03 -- common/autotest_common.sh@177 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:08:28.456 04:52:03 -- common/autotest_common.sh@181 -- # export PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:28.456 04:52:03 -- common/autotest_common.sh@181 -- # PYTHONPATH=:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/python 00:08:28.456 04:52:03 -- common/autotest_common.sh@185 -- # export PYTHONDONTWRITEBYTECODE=1 00:08:28.456 04:52:03 -- common/autotest_common.sh@185 -- # PYTHONDONTWRITEBYTECODE=1 00:08:28.456 04:52:03 -- common/autotest_common.sh@189 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:28.456 04:52:03 
-- common/autotest_common.sh@189 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:08:28.456 04:52:03 -- common/autotest_common.sh@190 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:28.456 04:52:03 -- common/autotest_common.sh@190 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:08:28.456 04:52:03 -- common/autotest_common.sh@194 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:08:28.456 04:52:03 -- common/autotest_common.sh@195 -- # rm -rf /var/tmp/asan_suppression_file 00:08:28.456 04:52:03 -- common/autotest_common.sh@196 -- # cat 00:08:28.457 04:52:03 -- common/autotest_common.sh@222 -- # echo leak:libfuse3.so 00:08:28.457 04:52:03 -- common/autotest_common.sh@224 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:28.457 04:52:03 -- common/autotest_common.sh@224 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:08:28.457 04:52:03 -- common/autotest_common.sh@226 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:28.457 04:52:03 -- common/autotest_common.sh@226 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:08:28.457 04:52:03 -- common/autotest_common.sh@228 -- # '[' -z /var/spdk/dependencies ']' 00:08:28.457 04:52:03 -- common/autotest_common.sh@231 -- # export DEPENDENCY_DIR 00:08:28.457 04:52:03 -- common/autotest_common.sh@235 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:28.457 04:52:03 -- common/autotest_common.sh@235 -- # SPDK_BIN_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/bin 00:08:28.457 04:52:03 -- common/autotest_common.sh@236 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:28.457 04:52:03 -- common/autotest_common.sh@236 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/build/examples 00:08:28.457 04:52:03 -- common/autotest_common.sh@239 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:28.457 04:52:03 -- common/autotest_common.sh@239 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:08:28.457 04:52:03 -- common/autotest_common.sh@240 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:28.457 04:52:03 -- common/autotest_common.sh@240 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:08:28.457 04:52:03 -- common/autotest_common.sh@242 -- # export AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:28.457 04:52:03 -- common/autotest_common.sh@242 -- # AR_TOOL=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:08:28.457 04:52:03 -- common/autotest_common.sh@245 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:28.457 04:52:03 -- common/autotest_common.sh@245 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:08:28.457 04:52:03 -- common/autotest_common.sh@247 -- # _LCOV_MAIN=0 00:08:28.457 04:52:03 -- common/autotest_common.sh@248 -- # _LCOV_LLVM=1 00:08:28.457 04:52:03 -- common/autotest_common.sh@249 -- # _LCOV= 00:08:28.457 04:52:03 -- common/autotest_common.sh@250 -- # [[ '' == *clang* ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@250 -- # [[ 1 -eq 1 ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@250 -- # _LCOV=1 00:08:28.457 04:52:03 -- common/autotest_common.sh@252 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool 
/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:28.457 04:52:03 -- common/autotest_common.sh@253 -- # _lcov_opt[_LCOV_MAIN]= 00:08:28.457 04:52:03 -- common/autotest_common.sh@255 -- # lcov_opt='--gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:08:28.457 04:52:03 -- common/autotest_common.sh@258 -- # '[' 0 -eq 0 ']' 00:08:28.457 04:52:03 -- common/autotest_common.sh@259 -- # export valgrind= 00:08:28.457 04:52:03 -- common/autotest_common.sh@259 -- # valgrind= 00:08:28.457 04:52:03 -- common/autotest_common.sh@265 -- # uname -s 00:08:28.457 04:52:03 -- common/autotest_common.sh@265 -- # '[' Linux = Linux ']' 00:08:28.457 04:52:03 -- common/autotest_common.sh@266 -- # HUGEMEM=4096 00:08:28.457 04:52:03 -- common/autotest_common.sh@267 -- # export CLEAR_HUGE=yes 00:08:28.457 04:52:03 -- common/autotest_common.sh@267 -- # CLEAR_HUGE=yes 00:08:28.457 04:52:03 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@268 -- # [[ 0 -eq 1 ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@275 -- # MAKE=make 00:08:28.457 04:52:03 -- common/autotest_common.sh@276 -- # MAKEFLAGS=-j112 00:08:28.457 04:52:03 -- common/autotest_common.sh@292 -- # export HUGEMEM=4096 00:08:28.457 04:52:03 -- common/autotest_common.sh@292 -- # HUGEMEM=4096 00:08:28.457 04:52:03 -- common/autotest_common.sh@294 -- # '[' -z /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output ']' 00:08:28.457 04:52:03 -- common/autotest_common.sh@299 -- # NO_HUGE=() 00:08:28.457 04:52:03 -- common/autotest_common.sh@300 -- # TEST_MODE= 00:08:28.457 04:52:03 -- common/autotest_common.sh@319 -- # [[ -z 3690154 ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@319 -- # kill -0 3690154 00:08:28.457 04:52:03 -- common/autotest_common.sh@1675 -- # set_test_storage 2147483648 00:08:28.457 04:52:03 -- common/autotest_common.sh@329 -- # [[ -v testdir ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@331 -- # local requested_size=2147483648 00:08:28.457 04:52:03 -- common/autotest_common.sh@332 -- # local mount target_dir 00:08:28.457 04:52:03 -- common/autotest_common.sh@334 -- # local -A mounts fss sizes avails uses 00:08:28.457 04:52:03 -- common/autotest_common.sh@335 -- # local source fs size avail mount use 00:08:28.457 04:52:03 -- common/autotest_common.sh@337 -- # local storage_fallback storage_candidates 00:08:28.457 04:52:03 -- common/autotest_common.sh@339 -- # mktemp -udt spdk.XXXXXX 00:08:28.457 04:52:03 -- common/autotest_common.sh@339 -- # storage_fallback=/tmp/spdk.FfrgBc 00:08:28.457 04:52:03 -- common/autotest_common.sh@344 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:08:28.457 04:52:03 -- common/autotest_common.sh@346 -- # [[ -n '' ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@351 -- # [[ -n '' ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@356 -- # mkdir -p /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio /tmp/spdk.FfrgBc/tests/vfio /tmp/spdk.FfrgBc 00:08:28.457 04:52:03 -- common/autotest_common.sh@359 -- # requested_size=2214592512 00:08:28.457 04:52:03 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:28.457 04:52:03 -- common/autotest_common.sh@328 -- # df -T 00:08:28.457 04:52:03 -- common/autotest_common.sh@328 -- # grep -v Filesystem 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_devtmpfs 00:08:28.457 04:52:03 -- 
common/autotest_common.sh@362 -- # fss["$mount"]=devtmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # avails["$mount"]=67108864 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # sizes["$mount"]=67108864 00:08:28.457 04:52:03 -- common/autotest_common.sh@364 -- # uses["$mount"]=0 00:08:28.457 04:52:03 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # mounts["$mount"]=/dev/pmem0 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # fss["$mount"]=ext2 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # avails["$mount"]=4096 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # sizes["$mount"]=5284429824 00:08:28.457 04:52:03 -- common/autotest_common.sh@364 -- # uses["$mount"]=5284425728 00:08:28.457 04:52:03 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # mounts["$mount"]=spdk_root 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # fss["$mount"]=overlay 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # avails["$mount"]=54008008704 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # sizes["$mount"]=61730578432 00:08:28.457 04:52:03 -- common/autotest_common.sh@364 -- # uses["$mount"]=7722569728 00:08:28.457 04:52:03 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # avails["$mount"]=30862696448 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:08:28.457 04:52:03 -- common/autotest_common.sh@364 -- # uses["$mount"]=2592768 00:08:28.457 04:52:03 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # avails["$mount"]=12340117504 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # sizes["$mount"]=12346118144 00:08:28.457 04:52:03 -- common/autotest_common.sh@364 -- # uses["$mount"]=6000640 00:08:28.457 04:52:03 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # avails["$mount"]=30864465920 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # sizes["$mount"]=30865289216 00:08:28.457 04:52:03 -- common/autotest_common.sh@364 -- # uses["$mount"]=823296 00:08:28.457 04:52:03 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # mounts["$mount"]=tmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@362 -- # fss["$mount"]=tmpfs 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # avails["$mount"]=6173044736 00:08:28.457 04:52:03 -- common/autotest_common.sh@363 -- # sizes["$mount"]=6173057024 00:08:28.457 04:52:03 -- common/autotest_common.sh@364 -- # uses["$mount"]=12288 00:08:28.457 04:52:03 -- common/autotest_common.sh@361 -- # read -r source fs size use avail _ mount 
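The df -T scan traced above is how set_test_storage() builds its picture of available space: each data line of `df -T` is split into fields by `read -r` and cached in associative arrays keyed on mount point, which the storage-candidate check below then consults. A minimal sketch of that pattern, with variable names taken from the trace (the loop body is condensed, not the verbatim autotest_common.sh source):

    declare -A mounts fss sizes avails uses
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source    # e.g. spdk_root for /
        fss["$mount"]=$fs           # e.g. overlay
        sizes["$mount"]=$size
        avails["$mount"]=$avail     # 54008008704 for / in this run
        uses["$mount"]=$use
    done < <(df -T | grep -v Filesystem)

The `(( target_space >= requested_size ))` test further down is then just avails[/] compared against the requested 2214592512 bytes.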
00:08:28.457 04:52:03 -- common/autotest_common.sh@367 -- # printf '* Looking for test storage...\n' 00:08:28.457 * Looking for test storage... 00:08:28.457 04:52:03 -- common/autotest_common.sh@369 -- # local target_space new_size 00:08:28.457 04:52:03 -- common/autotest_common.sh@370 -- # for target_dir in "${storage_candidates[@]}" 00:08:28.457 04:52:03 -- common/autotest_common.sh@373 -- # df /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:28.457 04:52:03 -- common/autotest_common.sh@373 -- # awk '$1 !~ /Filesystem/{print $6}' 00:08:28.457 04:52:03 -- common/autotest_common.sh@373 -- # mount=/ 00:08:28.457 04:52:03 -- common/autotest_common.sh@375 -- # target_space=54008008704 00:08:28.457 04:52:03 -- common/autotest_common.sh@376 -- # (( target_space == 0 || target_space < requested_size )) 00:08:28.457 04:52:03 -- common/autotest_common.sh@379 -- # (( target_space >= requested_size )) 00:08:28.457 04:52:03 -- common/autotest_common.sh@381 -- # [[ overlay == tmpfs ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@381 -- # [[ overlay == ramfs ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@381 -- # [[ / == / ]] 00:08:28.457 04:52:03 -- common/autotest_common.sh@382 -- # new_size=9937162240 00:08:28.457 04:52:03 -- common/autotest_common.sh@383 -- # (( new_size * 100 / sizes[/] > 95 )) 00:08:28.457 04:52:03 -- common/autotest_common.sh@388 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:28.457 04:52:03 -- common/autotest_common.sh@388 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:28.457 04:52:03 -- common/autotest_common.sh@389 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:28.457 * Found test storage at /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio 00:08:28.457 04:52:03 -- common/autotest_common.sh@390 -- # return 0 00:08:28.457 04:52:03 -- common/autotest_common.sh@1677 -- # set -o errtrace 00:08:28.457 04:52:03 -- common/autotest_common.sh@1678 -- # shopt -s extdebug 00:08:28.458 04:52:03 -- common/autotest_common.sh@1679 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:08:28.458 04:52:03 -- common/autotest_common.sh@1681 -- # PS4=' \t -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:08:28.458 04:52:03 -- common/autotest_common.sh@1682 -- # true 00:08:28.458 04:52:03 -- common/autotest_common.sh@1684 -- # xtrace_fd 00:08:28.458 04:52:03 -- common/autotest_common.sh@25 -- # [[ -n 14 ]] 00:08:28.458 04:52:03 -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/14 ]] 00:08:28.458 04:52:03 -- common/autotest_common.sh@27 -- # exec 00:08:28.458 04:52:03 -- common/autotest_common.sh@29 -- # exec 00:08:28.458 04:52:03 -- common/autotest_common.sh@31 -- # xtrace_restore 00:08:28.458 04:52:03 -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:08:28.458 04:52:03 -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:08:28.458 04:52:03 -- common/autotest_common.sh@18 -- # set -x 00:08:28.458 04:52:03 -- common/autotest_common.sh@1689 -- # [[ y == y ]] 00:08:28.458 04:52:03 -- common/autotest_common.sh@1690 -- # awk '{print $NF}' 00:08:28.458 04:52:03 -- common/autotest_common.sh@1690 -- # lcov --version 00:08:28.458 04:52:03 -- common/autotest_common.sh@1690 -- # lt 1.15 2 00:08:28.458 04:52:03 -- scripts/common.sh@372 -- # cmp_versions 1.15 '<' 2 00:08:28.458 04:52:03 -- scripts/common.sh@332 -- # local ver1 ver1_l 00:08:28.458 04:52:03 -- scripts/common.sh@333 -- # local ver2 ver2_l 00:08:28.458 04:52:03 -- scripts/common.sh@335 -- # IFS=.-: 00:08:28.458 04:52:03 -- scripts/common.sh@335 -- # read -ra ver1 00:08:28.458 04:52:03 -- scripts/common.sh@336 -- # IFS=.-: 00:08:28.458 04:52:03 -- scripts/common.sh@336 -- # read -ra ver2 00:08:28.458 04:52:03 -- scripts/common.sh@337 -- # local 'op=<' 00:08:28.458 04:52:03 -- scripts/common.sh@339 -- # ver1_l=2 00:08:28.458 04:52:03 -- scripts/common.sh@340 -- # ver2_l=1 00:08:28.458 04:52:03 -- scripts/common.sh@342 -- # local lt=0 gt=0 eq=0 v 00:08:28.458 04:52:03 -- scripts/common.sh@343 -- # case "$op" in 00:08:28.458 04:52:03 -- scripts/common.sh@344 -- # : 1 00:08:28.458 04:52:03 -- scripts/common.sh@363 -- # (( v = 0 )) 00:08:28.458 04:52:03 -- scripts/common.sh@363 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:08:28.458 04:52:03 -- scripts/common.sh@364 -- # decimal 1 00:08:28.458 04:52:03 -- scripts/common.sh@352 -- # local d=1 00:08:28.458 04:52:03 -- scripts/common.sh@353 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:28.458 04:52:03 -- scripts/common.sh@354 -- # echo 1 00:08:28.458 04:52:03 -- scripts/common.sh@364 -- # ver1[v]=1 00:08:28.458 04:52:03 -- scripts/common.sh@365 -- # decimal 2 00:08:28.458 04:52:03 -- scripts/common.sh@352 -- # local d=2 00:08:28.458 04:52:03 -- scripts/common.sh@353 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:28.458 04:52:03 -- scripts/common.sh@354 -- # echo 2 00:08:28.458 04:52:03 -- scripts/common.sh@365 -- # ver2[v]=2 00:08:28.458 04:52:03 -- scripts/common.sh@366 -- # (( ver1[v] > ver2[v] )) 00:08:28.458 04:52:03 -- scripts/common.sh@367 -- # (( ver1[v] < ver2[v] )) 00:08:28.458 04:52:03 -- scripts/common.sh@367 -- # return 0 00:08:28.458 04:52:03 -- common/autotest_common.sh@1691 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:28.458 04:52:03 -- common/autotest_common.sh@1703 -- # export 'LCOV_OPTS= 00:08:28.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:28.458 --rc genhtml_branch_coverage=1 00:08:28.458 --rc genhtml_function_coverage=1 00:08:28.458 --rc genhtml_legend=1 00:08:28.458 --rc geninfo_all_blocks=1 00:08:28.458 --rc geninfo_unexecuted_blocks=1 00:08:28.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:28.458 ' 00:08:28.458 04:52:03 -- common/autotest_common.sh@1703 -- # LCOV_OPTS=' 00:08:28.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:28.458 --rc genhtml_branch_coverage=1 00:08:28.458 --rc genhtml_function_coverage=1 00:08:28.458 --rc genhtml_legend=1 00:08:28.458 --rc geninfo_all_blocks=1 00:08:28.458 --rc geninfo_unexecuted_blocks=1 00:08:28.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:28.458 ' 00:08:28.458 04:52:03 -- common/autotest_common.sh@1704 -- # export 'LCOV=lcov 00:08:28.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:08:28.458 --rc genhtml_branch_coverage=1 00:08:28.458 --rc genhtml_function_coverage=1 00:08:28.458 --rc genhtml_legend=1 00:08:28.458 --rc geninfo_all_blocks=1 00:08:28.458 --rc geninfo_unexecuted_blocks=1 00:08:28.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:28.458 ' 00:08:28.458 04:52:03 -- common/autotest_common.sh@1704 -- # LCOV='lcov 00:08:28.458 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:28.458 --rc genhtml_branch_coverage=1 00:08:28.458 --rc genhtml_function_coverage=1 00:08:28.458 --rc genhtml_legend=1 00:08:28.458 --rc geninfo_all_blocks=1 00:08:28.458 --rc geninfo_unexecuted_blocks=1 00:08:28.458 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh 00:08:28.458 ' 00:08:28.458 04:52:03 -- vfio/run.sh@56 -- # source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/../common.sh 00:08:28.458 04:52:03 -- ../common.sh@8 -- # pids=() 00:08:28.458 04:52:03 -- vfio/run.sh@58 -- # fuzzfile=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:28.458 04:52:03 -- vfio/run.sh@59 -- # grep -c '\.fn =' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c 00:08:28.458 04:52:03 -- vfio/run.sh@59 -- # fuzz_num=7 00:08:28.458 04:52:03 -- vfio/run.sh@60 -- # (( fuzz_num != 0 )) 00:08:28.458 04:52:03 -- vfio/run.sh@62 -- # trap 'cleanup /tmp/vfio-user-*; exit 1' SIGINT SIGTERM EXIT 00:08:28.458 04:52:03 -- vfio/run.sh@65 -- # mem_size=0 00:08:28.458 04:52:03 -- vfio/run.sh@66 -- # [[ 1 -eq 1 ]] 00:08:28.458 04:52:03 -- vfio/run.sh@67 -- # start_llvm_fuzz_short 7 1 00:08:28.458 04:52:03 -- ../common.sh@69 -- # local fuzz_num=7 00:08:28.458 04:52:03 -- ../common.sh@70 -- # local time=1 00:08:28.458 04:52:03 -- ../common.sh@72 -- # (( i = 0 )) 00:08:28.458 04:52:03 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:28.458 04:52:03 -- ../common.sh@73 -- # start_llvm_fuzz 0 1 0x1 00:08:28.458 04:52:03 -- vfio/run.sh@22 -- # local fuzzer_type=0 00:08:28.458 04:52:03 -- vfio/run.sh@23 -- # local timen=1 00:08:28.458 04:52:03 -- vfio/run.sh@24 -- # local core=0x1 00:08:28.458 04:52:03 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:28.458 04:52:03 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-0 00:08:28.458 04:52:03 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-0/domain/1 00:08:28.458 04:52:03 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-0/domain/2 00:08:28.458 04:52:03 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-0/fuzz_vfio_json.conf 00:08:28.458 04:52:03 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-0 /tmp/vfio-user-0/domain/1 /tmp/vfio-user-0/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:28.458 04:52:03 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-0/domain/1%; 00:08:28.458 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-0/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:28.458 04:52:03 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-0/domain/1 -c /tmp/vfio-user-0/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 
-Y /tmp/vfio-user-0/domain/2 -r /tmp/vfio-user-0/spdk0.sock -Z 0 00:08:28.458 [2024-11-08 04:52:03.509335] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:28.458 [2024-11-08 04:52:03.509403] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3690220 ] 00:08:28.458 EAL: No free 2048 kB hugepages reported on node 1 00:08:28.717 [2024-11-08 04:52:03.580962] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:28.717 [2024-11-08 04:52:03.650187] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:28.717 [2024-11-08 04:52:03.650337] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.977 INFO: Running with entropic power schedule (0xFF, 100). 00:08:28.977 INFO: Seed: 4277195767 00:08:28.977 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:28.977 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:28.977 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_0 00:08:28.977 INFO: A corpus is not provided, starting from an empty corpus 00:08:28.977 #2 INITED exec/s: 0 rss: 62Mb 00:08:28.977 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:28.977 This may also happen if the target rejected all inputs we tried so far 00:08:29.236 NEW_FUNC[1/631]: 0x43a218 in fuzz_vfio_user_region_rw /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:85 00:08:29.236 NEW_FUNC[2/631]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:29.236 #6 NEW cov: 10762 ft: 10324 corp: 2/42b lim: 60 exec/s: 0 rss: 67Mb L: 41/41 MS: 4 ShuffleBytes-CrossOver-ChangeByte-InsertRepeatedBytes- 00:08:29.495 #7 NEW cov: 10779 ft: 13062 corp: 3/84b lim: 60 exec/s: 0 rss: 69Mb L: 42/42 MS: 1 InsertByte- 00:08:29.495 #8 NEW cov: 10779 ft: 14724 corp: 4/121b lim: 60 exec/s: 0 rss: 70Mb L: 37/42 MS: 1 EraseBytes- 00:08:29.754 #9 NEW cov: 10779 ft: 15102 corp: 5/178b lim: 60 exec/s: 0 rss: 70Mb L: 57/57 MS: 1 InsertRepeatedBytes- 00:08:29.754 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:29.754 #10 NEW cov: 10796 ft: 15550 corp: 6/220b lim: 60 exec/s: 0 rss: 70Mb L: 42/57 MS: 1 ShuffleBytes- 00:08:29.754 #11 NEW cov: 10796 ft: 16084 corp: 7/279b lim: 60 exec/s: 11 rss: 70Mb L: 59/59 MS: 1 CopyPart- 00:08:30.013 #12 NEW cov: 10796 ft: 16199 corp: 8/322b lim: 60 exec/s: 12 rss: 70Mb L: 43/59 MS: 1 InsertByte- 00:08:30.013 #13 NEW cov: 10796 ft: 16249 corp: 9/381b lim: 60 exec/s: 13 rss: 70Mb L: 59/59 MS: 1 ChangeBinInt- 00:08:30.272 #14 NEW cov: 10796 ft: 16285 corp: 10/423b lim: 60 exec/s: 14 rss: 70Mb L: 42/59 MS: 1 ChangeBinInt- 00:08:30.272 #15 NEW cov: 10796 ft: 16352 corp: 11/465b lim: 60 exec/s: 15 rss: 70Mb L: 42/59 MS: 1 ChangeBinInt- 00:08:30.531 #21 NEW cov: 10796 ft: 16370 corp: 12/524b lim: 60 exec/s: 21 rss: 70Mb L: 59/59 MS: 1 ChangeBit- 00:08:30.531 #22 NEW cov: 10796 ft: 16479 corp: 13/582b lim: 60 exec/s: 22 rss: 70Mb L: 58/59 MS: 1 InsertByte- 00:08:30.790 #28 NEW cov: 10803 ft: 16776 corp: 14/619b lim: 60 exec/s: 28 rss: 70Mb L: 37/59 MS: 1 CMP- DE: 
"\003\000\000\000\000\000\000\000"- 00:08:30.790 #29 NEW cov: 10803 ft: 17021 corp: 15/661b lim: 60 exec/s: 29 rss: 70Mb L: 42/59 MS: 1 ShuffleBytes- 00:08:31.049 #30 NEW cov: 10803 ft: 17086 corp: 16/706b lim: 60 exec/s: 15 rss: 70Mb L: 45/59 MS: 1 CMP- DE: "\001\020"- 00:08:31.049 #30 DONE cov: 10803 ft: 17086 corp: 16/706b lim: 60 exec/s: 15 rss: 70Mb 00:08:31.049 ###### Recommended dictionary. ###### 00:08:31.049 "\003\000\000\000\000\000\000\000" # Uses: 0 00:08:31.049 "\001\020" # Uses: 0 00:08:31.049 ###### End of recommended dictionary. ###### 00:08:31.049 Done 30 runs in 2 second(s) 00:08:31.309 04:52:06 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-0 00:08:31.309 04:52:06 -- ../common.sh@72 -- # (( i++ )) 00:08:31.309 04:52:06 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:31.309 04:52:06 -- ../common.sh@73 -- # start_llvm_fuzz 1 1 0x1 00:08:31.309 04:52:06 -- vfio/run.sh@22 -- # local fuzzer_type=1 00:08:31.309 04:52:06 -- vfio/run.sh@23 -- # local timen=1 00:08:31.309 04:52:06 -- vfio/run.sh@24 -- # local core=0x1 00:08:31.309 04:52:06 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:31.309 04:52:06 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-1 00:08:31.309 04:52:06 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-1/domain/1 00:08:31.309 04:52:06 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-1/domain/2 00:08:31.309 04:52:06 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-1/fuzz_vfio_json.conf 00:08:31.309 04:52:06 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-1 /tmp/vfio-user-1/domain/1 /tmp/vfio-user-1/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:31.309 04:52:06 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-1/domain/1%; 00:08:31.309 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-1/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:31.309 04:52:06 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-1/domain/1 -c /tmp/vfio-user-1/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 -Y /tmp/vfio-user-1/domain/2 -r /tmp/vfio-user-1/spdk1.sock -Z 1 00:08:31.309 [2024-11-08 04:52:06.220540] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:31.309 [2024-11-08 04:52:06.220632] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3690759 ] 00:08:31.309 EAL: No free 2048 kB hugepages reported on node 1 00:08:31.309 [2024-11-08 04:52:06.291387] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.309 [2024-11-08 04:52:06.362235] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:31.309 [2024-11-08 04:52:06.362371] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.569 INFO: Running with entropic power schedule (0xFF, 100). 
00:08:31.569 INFO: Seed: 2693246700 00:08:31.569 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:31.569 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:31.569 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_1 00:08:31.569 INFO: A corpus is not provided, starting from an empty corpus 00:08:31.569 #2 INITED exec/s: 0 rss: 61Mb 00:08:31.569 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:31.569 This may also happen if the target rejected all inputs we tried so far 00:08:31.569 [2024-11-08 04:52:06.648555] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:31.569 [2024-11-08 04:52:06.648590] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:31.569 [2024-11-08 04:52:06.648607] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.087 NEW_FUNC[1/638]: 0x43a7b8 in fuzz_vfio_user_version /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:72 00:08:32.087 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:32.087 #6 NEW cov: 10779 ft: 10727 corp: 2/13b lim: 40 exec/s: 0 rss: 68Mb L: 12/12 MS: 4 ChangeByte-ChangeASCIIInt-CopyPart-InsertRepeatedBytes- 00:08:32.087 [2024-11-08 04:52:07.110368] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.087 [2024-11-08 04:52:07.110403] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.087 [2024-11-08 04:52:07.110422] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.345 #12 NEW cov: 10796 ft: 13383 corp: 3/53b lim: 40 exec/s: 0 rss: 69Mb L: 40/40 MS: 1 InsertRepeatedBytes- 00:08:32.345 [2024-11-08 04:52:07.312518] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.345 [2024-11-08 04:52:07.312548] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.345 [2024-11-08 04:52:07.312565] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.345 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:32.345 #13 NEW cov: 10813 ft: 13970 corp: 4/93b lim: 40 exec/s: 0 rss: 70Mb L: 40/40 MS: 1 ChangeByte- 00:08:32.604 [2024-11-08 04:52:07.502360] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.604 [2024-11-08 04:52:07.502383] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.604 [2024-11-08 04:52:07.502399] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.604 #14 NEW cov: 10813 ft: 14626 corp: 5/105b lim: 40 exec/s: 14 rss: 70Mb L: 12/40 MS: 1 ChangeByte- 00:08:32.604 [2024-11-08 04:52:07.694175] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.604 [2024-11-08 04:52:07.694197] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.604 [2024-11-08 04:52:07.694215] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:32.862 #15 NEW cov: 10813 ft: 15609 corp: 6/117b lim: 40 exec/s: 15 rss: 
70Mb L: 12/40 MS: 1 CMP- DE: "9b\3254\377\177\000\000"- 00:08:32.862 [2024-11-08 04:52:07.884852] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:32.862 [2024-11-08 04:52:07.884875] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:32.862 [2024-11-08 04:52:07.884892] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:33.120 #16 NEW cov: 10813 ft: 15724 corp: 7/129b lim: 40 exec/s: 16 rss: 70Mb L: 12/40 MS: 1 ChangeBit- 00:08:33.120 [2024-11-08 04:52:08.074391] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:33.120 [2024-11-08 04:52:08.074413] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:33.120 [2024-11-08 04:52:08.074430] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:33.120 #17 NEW cov: 10813 ft: 15845 corp: 8/141b lim: 40 exec/s: 17 rss: 70Mb L: 12/40 MS: 1 CopyPart- 00:08:33.379 [2024-11-08 04:52:08.265602] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:33.379 [2024-11-08 04:52:08.265623] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:33.379 [2024-11-08 04:52:08.265640] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:33.379 #18 NEW cov: 10820 ft: 15967 corp: 9/168b lim: 40 exec/s: 18 rss: 70Mb L: 27/40 MS: 1 InsertRepeatedBytes- 00:08:33.379 [2024-11-08 04:52:08.457085] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: bad command 1 00:08:33.379 [2024-11-08 04:52:08.457107] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-1/domain/1: msg0: cmd 1 failed: Invalid argument 00:08:33.379 [2024-11-08 04:52:08.457125] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 1 return failure 00:08:33.641 #19 NEW cov: 10820 ft: 16036 corp: 10/194b lim: 40 exec/s: 9 rss: 70Mb L: 26/40 MS: 1 InsertRepeatedBytes- 00:08:33.641 #19 DONE cov: 10820 ft: 16036 corp: 10/194b lim: 40 exec/s: 9 rss: 70Mb 00:08:33.641 ###### Recommended dictionary. ###### 00:08:33.641 "9b\3254\377\177\000\000" # Uses: 0 00:08:33.641 ###### End of recommended dictionary. 
###### 00:08:33.641 Done 19 runs in 2 second(s) 00:08:33.970 04:52:08 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-1 00:08:33.970 04:52:08 -- ../common.sh@72 -- # (( i++ )) 00:08:33.970 04:52:08 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:33.970 04:52:08 -- ../common.sh@73 -- # start_llvm_fuzz 2 1 0x1 00:08:33.970 04:52:08 -- vfio/run.sh@22 -- # local fuzzer_type=2 00:08:33.970 04:52:08 -- vfio/run.sh@23 -- # local timen=1 00:08:33.970 04:52:08 -- vfio/run.sh@24 -- # local core=0x1 00:08:33.970 04:52:08 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:33.970 04:52:08 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-2 00:08:33.970 04:52:08 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-2/domain/1 00:08:33.970 04:52:08 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-2/domain/2 00:08:33.970 04:52:08 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-2/fuzz_vfio_json.conf 00:08:33.970 04:52:08 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-2 /tmp/vfio-user-2/domain/1 /tmp/vfio-user-2/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:33.970 04:52:08 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-2/domain/1%; 00:08:33.970 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-2/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:33.970 04:52:08 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-2/domain/1 -c /tmp/vfio-user-2/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 -Y /tmp/vfio-user-2/domain/2 -r /tmp/vfio-user-2/spdk2.sock -Z 2 00:08:33.970 [2024-11-08 04:52:08.868620] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:33.970 [2024-11-08 04:52:08.868711] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3691176 ] 00:08:33.970 EAL: No free 2048 kB hugepages reported on node 1 00:08:33.970 [2024-11-08 04:52:08.941027] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:33.970 [2024-11-08 04:52:09.011270] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:33.970 [2024-11-08 04:52:09.011407] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:34.229 INFO: Running with entropic power schedule (0xFF, 100). 00:08:34.229 INFO: Seed: 1054288771 00:08:34.229 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:34.229 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:34.229 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_2 00:08:34.229 INFO: A corpus is not provided, starting from an empty corpus 00:08:34.229 #2 INITED exec/s: 0 rss: 62Mb 00:08:34.229 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:34.229 This may also happen if the target rejected all inputs we tried so far 00:08:34.229 [2024-11-08 04:52:09.301877] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:34.745 NEW_FUNC[1/636]: 0x43b1a8 in fuzz_vfio_user_get_region_info /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:104 00:08:34.745 NEW_FUNC[2/636]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:34.745 #3 NEW cov: 10757 ft: 10602 corp: 2/35b lim: 80 exec/s: 0 rss: 67Mb L: 34/34 MS: 1 InsertRepeatedBytes- 00:08:34.745 [2024-11-08 04:52:09.778920] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.008 #9 NEW cov: 10776 ft: 13217 corp: 3/69b lim: 80 exec/s: 0 rss: 69Mb L: 34/34 MS: 1 ChangeBit- 00:08:35.008 [2024-11-08 04:52:09.970286] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.008 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:35.008 #10 NEW cov: 10793 ft: 14984 corp: 4/105b lim: 80 exec/s: 0 rss: 70Mb L: 36/36 MS: 1 InsertRepeatedBytes- 00:08:35.337 [2024-11-08 04:52:10.194346] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.337 #11 NEW cov: 10793 ft: 15444 corp: 5/139b lim: 80 exec/s: 11 rss: 70Mb L: 34/36 MS: 1 ShuffleBytes- 00:08:35.337 [2024-11-08 04:52:10.387290] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.596 #14 NEW cov: 10793 ft: 15981 corp: 6/174b lim: 80 exec/s: 14 rss: 70Mb L: 35/36 MS: 3 ChangeByte-ChangeByte-CrossOver- 00:08:35.596 [2024-11-08 04:52:10.579428] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.596 #15 NEW cov: 10793 ft: 16371 corp: 7/210b lim: 80 exec/s: 15 rss: 70Mb L: 36/36 MS: 1 ShuffleBytes- 00:08:35.855 [2024-11-08 04:52:10.771801] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:35.855 #16 NEW cov: 10793 ft: 16434 corp: 8/263b lim: 80 exec/s: 16 rss: 70Mb L: 53/53 MS: 1 CrossOver- 00:08:35.855 [2024-11-08 04:52:10.961673] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:36.114 #17 NEW cov: 10800 ft: 16584 corp: 9/299b lim: 80 exec/s: 17 rss: 70Mb L: 36/53 MS: 1 ChangeByte- 00:08:36.114 [2024-11-08 04:52:11.152471] vfio_user.c: 170:vfio_user_dev_send_request: *ERROR*: Oversized argument length, command 5 00:08:36.373 #18 NEW cov: 10800 ft: 16705 corp: 10/338b lim: 80 exec/s: 9 rss: 70Mb L: 39/53 MS: 1 InsertRepeatedBytes- 00:08:36.373 #18 DONE cov: 10800 ft: 16705 corp: 10/338b lim: 80 exec/s: 9 rss: 70Mb 00:08:36.373 Done 18 runs in 2 second(s) 00:08:36.633 04:52:11 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-2 00:08:36.633 04:52:11 -- ../common.sh@72 -- # (( i++ )) 00:08:36.633 04:52:11 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:36.633 04:52:11 -- ../common.sh@73 -- # start_llvm_fuzz 3 1 0x1 00:08:36.633 04:52:11 -- vfio/run.sh@22 -- # local fuzzer_type=3 00:08:36.633 04:52:11 -- vfio/run.sh@23 -- # local timen=1 00:08:36.633 04:52:11 -- vfio/run.sh@24 -- # local core=0x1 00:08:36.633 04:52:11 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:36.633 04:52:11 -- vfio/run.sh@26 -- # local 
fuzzer_dir=/tmp/vfio-user-3 00:08:36.633 04:52:11 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-3/domain/1 00:08:36.633 04:52:11 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-3/domain/2 00:08:36.633 04:52:11 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-3/fuzz_vfio_json.conf 00:08:36.633 04:52:11 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-3 /tmp/vfio-user-3/domain/1 /tmp/vfio-user-3/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:36.633 04:52:11 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-3/domain/1%; 00:08:36.633 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-3/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:36.633 04:52:11 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-3/domain/1 -c /tmp/vfio-user-3/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 -Y /tmp/vfio-user-3/domain/2 -r /tmp/vfio-user-3/spdk3.sock -Z 3 00:08:36.633 [2024-11-08 04:52:11.561996] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:36.633 [2024-11-08 04:52:11.562049] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3691605 ] 00:08:36.633 EAL: No free 2048 kB hugepages reported on node 1 00:08:36.633 [2024-11-08 04:52:11.626310] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.633 [2024-11-08 04:52:11.696234] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:36.633 [2024-11-08 04:52:11.696386] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.892 INFO: Running with entropic power schedule (0xFF, 100). 00:08:36.892 INFO: Seed: 3737277943 00:08:36.892 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:36.892 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:36.892 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_3 00:08:36.892 INFO: A corpus is not provided, starting from an empty corpus 00:08:36.892 #2 INITED exec/s: 0 rss: 62Mb 00:08:36.892 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:36.892 This may also happen if the target rejected all inputs we tried so far 00:08:37.410 NEW_FUNC[1/632]: 0x43b898 in fuzz_vfio_user_dma_map /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:125 00:08:37.410 NEW_FUNC[2/632]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:37.410 #14 NEW cov: 10750 ft: 10688 corp: 2/113b lim: 320 exec/s: 0 rss: 67Mb L: 112/112 MS: 2 ShuffleBytes-InsertRepeatedBytes- 00:08:37.670 #15 NEW cov: 10766 ft: 13623 corp: 3/225b lim: 320 exec/s: 0 rss: 69Mb L: 112/112 MS: 1 ChangeBinInt- 00:08:37.670 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:37.670 #16 NEW cov: 10783 ft: 14191 corp: 4/407b lim: 320 exec/s: 0 rss: 70Mb L: 182/182 MS: 1 InsertRepeatedBytes- 00:08:37.928 #17 NEW cov: 10783 ft: 14511 corp: 5/584b lim: 320 exec/s: 17 rss: 70Mb L: 177/182 MS: 1 EraseBytes- 00:08:38.187 #18 NEW cov: 10783 ft: 15213 corp: 6/674b lim: 320 exec/s: 18 rss: 70Mb L: 90/182 MS: 1 InsertRepeatedBytes- 00:08:38.446 #19 NEW cov: 10783 ft: 15365 corp: 7/856b lim: 320 exec/s: 19 rss: 70Mb L: 182/182 MS: 1 ChangeBinInt- 00:08:38.446 #20 NEW cov: 10783 ft: 15450 corp: 8/946b lim: 320 exec/s: 20 rss: 70Mb L: 90/182 MS: 1 CrossOver- 00:08:38.705 #21 NEW cov: 10790 ft: 16327 corp: 9/1036b lim: 320 exec/s: 21 rss: 70Mb L: 90/182 MS: 1 CrossOver- 00:08:38.964 #22 NEW cov: 10790 ft: 16837 corp: 10/1218b lim: 320 exec/s: 11 rss: 70Mb L: 182/182 MS: 1 CopyPart- 00:08:38.964 #22 DONE cov: 10790 ft: 16837 corp: 10/1218b lim: 320 exec/s: 11 rss: 70Mb 00:08:38.964 Done 22 runs in 2 second(s) 00:08:39.224 04:52:14 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-3 00:08:39.224 04:52:14 -- ../common.sh@72 -- # (( i++ )) 00:08:39.224 04:52:14 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:39.224 04:52:14 -- ../common.sh@73 -- # start_llvm_fuzz 4 1 0x1 00:08:39.224 04:52:14 -- vfio/run.sh@22 -- # local fuzzer_type=4 00:08:39.224 04:52:14 -- vfio/run.sh@23 -- # local timen=1 00:08:39.224 04:52:14 -- vfio/run.sh@24 -- # local core=0x1 00:08:39.224 04:52:14 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:39.224 04:52:14 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-4 00:08:39.224 04:52:14 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-4/domain/1 00:08:39.224 04:52:14 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-4/domain/2 00:08:39.224 04:52:14 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-4/fuzz_vfio_json.conf 00:08:39.224 04:52:14 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-4 /tmp/vfio-user-4/domain/1 /tmp/vfio-user-4/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:39.224 04:52:14 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-4/domain/1%; 00:08:39.224 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-4/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:39.224 04:52:14 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-4/domain/1 -c /tmp/vfio-user-4/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 -Y /tmp/vfio-user-4/domain/2 -r 
/tmp/vfio-user-4/spdk4.sock -Z 4 00:08:39.224 [2024-11-08 04:52:14.220069] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:39.224 [2024-11-08 04:52:14.220162] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3692143 ] 00:08:39.224 EAL: No free 2048 kB hugepages reported on node 1 00:08:39.224 [2024-11-08 04:52:14.292264] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.483 [2024-11-08 04:52:14.361460] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:39.483 [2024-11-08 04:52:14.361615] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:39.483 INFO: Running with entropic power schedule (0xFF, 100). 00:08:39.483 INFO: Seed: 2108307063 00:08:39.483 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:39.483 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:39.483 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_4 00:08:39.483 INFO: A corpus is not provided, starting from an empty corpus 00:08:39.483 #2 INITED exec/s: 0 rss: 61Mb 00:08:39.483 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 00:08:39.483 This may also happen if the target rejected all inputs we tried so far 00:08:40.000 NEW_FUNC[1/632]: 0x43c118 in fuzz_vfio_user_dma_unmap /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:145 00:08:40.000 NEW_FUNC[2/632]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:40.000 #10 NEW cov: 10750 ft: 10683 corp: 2/94b lim: 320 exec/s: 0 rss: 67Mb L: 93/93 MS: 3 InsertByte-CMP-InsertRepeatedBytes- DE: "\000\000\000\005"- 00:08:40.259 #19 NEW cov: 10764 ft: 13326 corp: 3/152b lim: 320 exec/s: 0 rss: 69Mb L: 58/93 MS: 4 PersAutoDict-ShuffleBytes-EraseBytes-InsertRepeatedBytes- DE: "\000\000\000\005"- 00:08:40.518 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:40.518 #20 NEW cov: 10781 ft: 14689 corp: 4/245b lim: 320 exec/s: 0 rss: 70Mb L: 93/93 MS: 1 ChangeBinInt- 00:08:40.518 #21 NEW cov: 10781 ft: 16043 corp: 5/344b lim: 320 exec/s: 21 rss: 70Mb L: 99/99 MS: 1 CrossOver- 00:08:40.777 #26 NEW cov: 10781 ft: 16576 corp: 6/409b lim: 320 exec/s: 26 rss: 70Mb L: 65/99 MS: 5 ChangeByte-PersAutoDict-ChangeBit-ChangeBit-InsertRepeatedBytes- DE: "\000\000\000\005"- 00:08:41.036 #27 NEW cov: 10781 ft: 16775 corp: 7/453b lim: 320 exec/s: 27 rss: 70Mb L: 44/99 MS: 1 EraseBytes- 00:08:41.295 #28 NEW cov: 10781 ft: 17019 corp: 8/542b lim: 320 exec/s: 28 rss: 70Mb L: 89/99 MS: 1 InsertRepeatedBytes- 00:08:41.295 #29 NEW cov: 10781 ft: 17216 corp: 9/641b lim: 320 exec/s: 29 rss: 70Mb L: 99/99 MS: 1 ChangeBit- 00:08:41.554 #30 NEW cov: 10788 ft: 17298 corp: 10/740b lim: 320 exec/s: 30 rss: 70Mb L: 99/99 MS: 1 ChangeBit- 00:08:41.813 #31 NEW cov: 10788 ft: 17565 corp: 11/860b lim: 320 exec/s: 15 rss: 70Mb L: 120/120 MS: 1 InsertRepeatedBytes- 00:08:41.813 #31 DONE cov: 10788 ft: 17565 corp: 11/860b lim: 320 exec/s: 15 rss: 70Mb 00:08:41.813 ###### Recommended dictionary. 
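Each `#N NEW cov: ...` record in these runs is a stock libFuzzer status line. Reading one actual line from the run above as a key (field meanings follow standard libFuzzer output; the second number in `L: a/b` is read here as the longest unit in the corpus, which is inferred from how the values grow in this log):

    #17 NEW                 # the 17th executed input hit new coverage and was kept
        cov: 10783          # coverage points (of the 341891 inline 8-bit counters) hit so far
        ft: 14511           # distinct coverage features, a finer-grained signal than cov
        corp: 5/584b        # corpus now holds 5 units totalling 584 bytes
        lim: 320            # current cap on generated input length
        exec/s: 17          # executions per second so far (0 while under the reporting window)
        rss: 70Mb           # resident memory of the fuzzing process
        L: 177/182          # this unit's length / longest unit now in the corpus
        MS: 1 EraseBytes-   # the mutation sequence that produced the input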
###### 00:08:41.813 "\000\000\000\005" # Uses: 2 00:08:41.813 ###### End of recommended dictionary. ###### 00:08:41.813 Done 31 runs in 2 second(s) 00:08:42.072 04:52:16 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-4 00:08:42.072 04:52:16 -- ../common.sh@72 -- # (( i++ )) 00:08:42.072 04:52:16 -- ../common.sh@72 -- # (( i < fuzz_num )) 00:08:42.072 04:52:16 -- ../common.sh@73 -- # start_llvm_fuzz 5 1 0x1 00:08:42.072 04:52:16 -- vfio/run.sh@22 -- # local fuzzer_type=5 00:08:42.072 04:52:16 -- vfio/run.sh@23 -- # local timen=1 00:08:42.072 04:52:16 -- vfio/run.sh@24 -- # local core=0x1 00:08:42.072 04:52:16 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:42.072 04:52:16 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-5 00:08:42.072 04:52:16 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-5/domain/1 00:08:42.072 04:52:16 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-5/domain/2 00:08:42.072 04:52:16 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-5/fuzz_vfio_json.conf 00:08:42.072 04:52:16 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-5 /tmp/vfio-user-5/domain/1 /tmp/vfio-user-5/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:42.072 04:52:16 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-5/domain/1%; 00:08:42.072 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-5/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf 00:08:42.073 04:52:16 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-5/domain/1 -c /tmp/vfio-user-5/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 -Y /tmp/vfio-user-5/domain/2 -r /tmp/vfio-user-5/spdk5.sock -Z 5 00:08:42.073 [2024-11-08 04:52:16.999499] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization... 00:08:42.073 [2024-11-08 04:52:16.999574] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3692693 ] 00:08:42.073 EAL: No free 2048 kB hugepages reported on node 1 00:08:42.073 [2024-11-08 04:52:17.063377] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.073 [2024-11-08 04:52:17.132152] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long 00:08:42.073 [2024-11-08 04:52:17.132302] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0 00:08:42.332 INFO: Running with entropic power schedule (0xFF, 100). 00:08:42.332 INFO: Seed: 579345038 00:08:42.332 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f), 00:08:42.332 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0), 00:08:42.332 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_5 00:08:42.332 INFO: A corpus is not provided, starting from an empty corpus 00:08:42.332 #2 INITED exec/s: 0 rss: 62Mb 00:08:42.332 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage? 
00:08:42.332 This may also happen if the target rejected all inputs we tried so far 00:08:42.332 [2024-11-08 04:52:17.420627] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:42.332 [2024-11-08 04:52:17.420673] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:42.849 NEW_FUNC[1/637]: 0x43cb18 in fuzz_vfio_user_irq_set /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:172 00:08:42.849 NEW_FUNC[2/637]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220 00:08:42.849 #3 NEW cov: 10688 ft: 10753 corp: 2/112b lim: 120 exec/s: 0 rss: 67Mb L: 111/111 MS: 1 InsertRepeatedBytes- 00:08:42.849 [2024-11-08 04:52:17.884839] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:42.849 [2024-11-08 04:52:17.884880] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:43.109 NEW_FUNC[1/1]: 0x45f948 in bdev_malloc_writev /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/module/bdev/malloc/bdev_malloc.c:357 00:08:43.109 #9 NEW cov: 10795 ft: 13271 corp: 3/223b lim: 120 exec/s: 0 rss: 69Mb L: 111/111 MS: 1 ChangeByte- 00:08:43.109 [2024-11-08 04:52:18.083325] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:43.109 [2024-11-08 04:52:18.083357] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:43.109 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609 00:08:43.109 #15 NEW cov: 10812 ft: 15517 corp: 4/334b lim: 120 exec/s: 0 rss: 70Mb L: 111/111 MS: 1 ChangeBinInt- 00:08:43.368 [2024-11-08 04:52:18.292070] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:43.368 [2024-11-08 04:52:18.292100] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:43.368 #16 NEW cov: 10815 ft: 15736 corp: 5/446b lim: 120 exec/s: 16 rss: 70Mb L: 112/112 MS: 1 InsertByte- 00:08:43.626 [2024-11-08 04:52:18.489316] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:43.626 [2024-11-08 04:52:18.489346] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:43.626 #17 NEW cov: 10815 ft: 16061 corp: 6/557b lim: 120 exec/s: 17 rss: 70Mb L: 111/112 MS: 1 ChangeBinInt- 00:08:43.626 [2024-11-08 04:52:18.684156] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:43.626 [2024-11-08 04:52:18.684184] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:43.885 #18 NEW cov: 10815 ft: 16234 corp: 7/673b lim: 120 exec/s: 18 rss: 70Mb L: 116/116 MS: 1 InsertRepeatedBytes- 00:08:43.885 [2024-11-08 04:52:18.882163] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:43.885 [2024-11-08 04:52:18.882197] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 00:08:44.143 #19 NEW cov: 10815 ft: 16502 corp: 8/784b lim: 120 exec/s: 19 rss: 70Mb L: 111/116 MS: 1 ChangeByte- 00:08:44.143 [2024-11-08 04:52:19.074910] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument 00:08:44.143 [2024-11-08 04:52:19.074940] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure 
00:08:44.143 #20 NEW cov: 10822 ft: 16632 corp: 9/895b lim: 120 exec/s: 20 rss: 70Mb L: 111/116 MS: 1 ShuffleBytes-
00:08:44.402 [2024-11-08 04:52:19.268991] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-5/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:44.402 [2024-11-08 04:52:19.269021] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:44.402 #21 NEW cov: 10822 ft: 16809 corp: 10/1006b lim: 120 exec/s: 10 rss: 70Mb L: 111/116 MS: 1 ChangeBinInt-
00:08:44.402 #21 DONE cov: 10822 ft: 16809 corp: 10/1006b lim: 120 exec/s: 10 rss: 70Mb
00:08:44.402 Done 21 runs in 2 second(s)
00:08:44.662 04:52:19 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-5
00:08:44.662 04:52:19 -- ../common.sh@72 -- # (( i++ ))
00:08:44.662 04:52:19 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:44.662 04:52:19 -- ../common.sh@73 -- # start_llvm_fuzz 6 1 0x1
00:08:44.662 04:52:19 -- vfio/run.sh@22 -- # local fuzzer_type=6
00:08:44.662 04:52:19 -- vfio/run.sh@23 -- # local timen=1
00:08:44.662 04:52:19 -- vfio/run.sh@24 -- # local core=0x1
00:08:44.662 04:52:19 -- vfio/run.sh@25 -- # local corpus_dir=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:44.662 04:52:19 -- vfio/run.sh@26 -- # local fuzzer_dir=/tmp/vfio-user-6
00:08:44.662 04:52:19 -- vfio/run.sh@27 -- # local vfiouser_dir=/tmp/vfio-user-6/domain/1
00:08:44.662 04:52:19 -- vfio/run.sh@28 -- # local vfiouser_io_dir=/tmp/vfio-user-6/domain/2
00:08:44.662 04:52:19 -- vfio/run.sh@29 -- # local vfiouser_cfg=/tmp/vfio-user-6/fuzz_vfio_json.conf
00:08:44.662 04:52:19 -- vfio/run.sh@31 -- # mkdir -p /tmp/vfio-user-6 /tmp/vfio-user-6/domain/1 /tmp/vfio-user-6/domain/2 /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:44.662 04:52:19 -- vfio/run.sh@34 -- # sed -e 's%/tmp/vfio-user/domain/1%/tmp/vfio-user-6/domain/1%;
00:08:44.662 s%/tmp/vfio-user/domain/2%/tmp/vfio-user-6/domain/2%' /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/vfio/fuzz_vfio_json.conf
00:08:44.662 04:52:19 -- vfio/run.sh@38 -- # /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz -m 0x1 -s 0 -P /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/llvm/ -F /tmp/vfio-user-6/domain/1 -c /tmp/vfio-user-6/fuzz_vfio_json.conf -t 1 -D /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6 -Y /tmp/vfio-user-6/domain/2 -r /tmp/vfio-user-6/spdk6.sock -Z 6
00:08:44.662 [2024-11-08 04:52:19.691411] Starting SPDK v24.01.1-pre git sha1 c13c99a5e / DPDK 23.11.0 initialization...
00:08:44.662 [2024-11-08 04:52:19.691481] [ DPDK EAL parameters: vfio_fuzz --no-shconf -c 0x1 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid3693087 ]
00:08:44.662 EAL: No free 2048 kB hugepages reported on node 1
00:08:44.662 [2024-11-08 04:52:19.762966] app.c: 798:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:44.921 [2024-11-08 04:52:19.832929] trace_flags.c: 278:trace_register_description: *ERROR*: name (RDMA_REQ_RDY_TO_COMPL_PEND) too long
00:08:44.921 [2024-11-08 04:52:19.833080] reactor.c: 937:reactor_run: *NOTICE*: Reactor started on core 0
00:08:44.921 INFO: Running with entropic power schedule (0xFF, 100).
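The vfio/run.sh trace above shows the pattern used for every fuzzer instance: create per-instance scratch directories, rewrite the shared vfio-user config template with sed so this instance talks to its own domain sockets, then launch llvm_vfio_fuzz against those paths. A condensed sketch of that setup, with a stand-in instance number N and shortened paths (the variable names here are illustrative, not the script's own):

# Per-instance fuzzer workspace, sketched from the run.sh xtrace above.
# N, base, and the template location are hypothetical stand-ins.
N=6
base=/tmp/vfio-user-$N
mkdir -p "$base/domain/1" "$base/domain/2" "corpus/llvm_vfio_$N"
# Retarget the shared template at this instance's own vfio-user domains.
sed -e "s%/tmp/vfio-user/domain/1%$base/domain/1%; s%/tmp/vfio-user/domain/2%$base/domain/2%" \
    fuzz_vfio_json.conf > "$base/fuzz_vfio_json.conf"
# The harness then runs with -F $base/domain/1, -Y $base/domain/2,
# -r $base/spdk$N.sock and -Z $N, so each instance has its own socket set.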
00:08:44.921 INFO: Seed: 3288350335
00:08:45.180 INFO: Loaded 1 modules (341891 inline 8-bit counters): 341891 [0x27db80c, 0x282ef8f),
00:08:45.180 INFO: Loaded 1 PC tables (341891 PCs): 341891 [0x282ef90,0x2d667c0),
00:08:45.180 INFO: 0 files found in /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../corpus/llvm_vfio_6
00:08:45.180 INFO: A corpus is not provided, starting from an empty corpus
00:08:45.180 #2 INITED exec/s: 0 rss: 62Mb
00:08:45.180 WARNING: no interesting inputs were found so far. Is the code instrumented for coverage?
00:08:45.180 This may also happen if the target rejected all inputs we tried so far
00:08:45.180 [2024-11-08 04:52:20.161455] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:45.180 [2024-11-08 04:52:20.161506] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:45.747 NEW_FUNC[1/638]: 0x43d808 in fuzz_vfio_user_set_msix /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:190
00:08:45.747 NEW_FUNC[2/638]: 0x43fdb8 in TestOneInput /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/app/fuzz/llvm_vfio_fuzz/llvm_vfio_fuzz.c:220
00:08:45.747 #11 NEW cov: 10773 ft: 10727 corp: 2/53b lim: 90 exec/s: 0 rss: 67Mb L: 52/52 MS: 4 ChangeByte-InsertByte-ShuffleBytes-InsertRepeatedBytes-
00:08:45.747 [2024-11-08 04:52:20.662344] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:45.747 [2024-11-08 04:52:20.662388] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:45.747 #14 NEW cov: 10790 ft: 13283 corp: 3/69b lim: 90 exec/s: 0 rss: 68Mb L: 16/52 MS: 3 ChangeByte-CopyPart-CrossOver-
00:08:46.006 [2024-11-08 04:52:20.874544] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.006 [2024-11-08 04:52:20.874577] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:46.006 NEW_FUNC[1/1]: 0x191add8 in get_rusage /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/event/reactor.c:609
00:08:46.006 #20 NEW cov: 10807 ft: 14284 corp: 4/85b lim: 90 exec/s: 0 rss: 69Mb L: 16/52 MS: 1 ChangeBinInt-
00:08:46.006 [2024-11-08 04:52:21.075305] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.006 [2024-11-08 04:52:21.075336] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:46.264 #21 NEW cov: 10807 ft: 15173 corp: 5/101b lim: 90 exec/s: 21 rss: 69Mb L: 16/52 MS: 1 ChangeByte-
00:08:46.264 [2024-11-08 04:52:21.277486] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.264 [2024-11-08 04:52:21.277516] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:46.522 #27 NEW cov: 10807 ft: 15464 corp: 6/150b lim: 90 exec/s: 27 rss: 70Mb L: 49/52 MS: 1 EraseBytes-
00:08:46.522 [2024-11-08 04:52:21.476301] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.522 [2024-11-08 04:52:21.476332] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:46.522 #28 NEW cov: 10807 ft: 15566 corp: 7/191b lim: 90 exec/s: 28 rss: 70Mb L: 41/52 MS: 1 InsertRepeatedBytes-
00:08:46.782 [2024-11-08 04:52:21.668571] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.782 [2024-11-08 04:52:21.668600] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:46.782 #29 NEW cov: 10807 ft: 16053 corp: 8/232b lim: 90 exec/s: 29 rss: 70Mb L: 41/52 MS: 1 ChangeBinInt-
00:08:46.782 [2024-11-08 04:52:21.861486] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:46.782 [2024-11-08 04:52:21.861516] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:47.041 #30 NEW cov: 10814 ft: 16219 corp: 9/262b lim: 90 exec/s: 30 rss: 70Mb L: 30/52 MS: 1 InsertRepeatedBytes-
00:08:47.041 [2024-11-08 04:52:22.055465] vfio_user.c:3096:vfio_user_log: *ERROR*: /tmp/vfio-user-6/domain/1: msg0: cmd 8 failed: Invalid argument
00:08:47.041 [2024-11-08 04:52:22.055495] vfio_user.c: 144:vfio_user_read: *ERROR*: Command 8 return failure
00:08:47.300 #31 NEW cov: 10814 ft: 16328 corp: 10/278b lim: 90 exec/s: 15 rss: 70Mb L: 16/52 MS: 1 ChangeBit-
00:08:47.301 #31 DONE cov: 10814 ft: 16328 corp: 10/278b lim: 90 exec/s: 15 rss: 70Mb
00:08:47.301 Done 31 runs in 2 second(s)
00:08:47.560 04:52:22 -- vfio/run.sh@49 -- # rm -rf /tmp/vfio-user-6
00:08:47.560 04:52:22 -- ../common.sh@72 -- # (( i++ ))
00:08:47.560 04:52:22 -- ../common.sh@72 -- # (( i < fuzz_num ))
00:08:47.560 04:52:22 -- vfio/run.sh@75 -- # trap - SIGINT SIGTERM EXIT
00:08:47.560
00:08:47.560 real 0m19.363s
00:08:47.560 user 0m27.076s
00:08:47.560 sys 0m1.819s
00:08:47.560 04:52:22 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:08:47.560 04:52:22 -- common/autotest_common.sh@10 -- # set +x
00:08:47.560 ************************************
00:08:47.560 END TEST vfio_fuzz
00:08:47.560 ************************************
00:08:47.560
00:08:47.560 real 1m23.956s
00:08:47.560 user 2m7.737s
00:08:47.560 sys 0m9.389s
00:08:47.560 04:52:22 -- common/autotest_common.sh@1115 -- # xtrace_disable
00:08:47.560 04:52:22 -- common/autotest_common.sh@10 -- # set +x
00:08:47.560 ************************************
00:08:47.560 END TEST llvm_fuzz
00:08:47.560 ************************************
00:08:47.560 04:52:22 -- spdk/autotest.sh@365 -- # [[ 0 -eq 1 ]]
00:08:47.560 04:52:22 -- spdk/autotest.sh@370 -- # trap - SIGINT SIGTERM EXIT
00:08:47.560 04:52:22 -- spdk/autotest.sh@372 -- # timing_enter post_cleanup
00:08:47.560 04:52:22 -- common/autotest_common.sh@722 -- # xtrace_disable
00:08:47.560 04:52:22 -- common/autotest_common.sh@10 -- # set +x
00:08:47.560 04:52:22 -- spdk/autotest.sh@373 -- # autotest_cleanup
00:08:47.560 04:52:22 -- common/autotest_common.sh@1381 -- # local autotest_es=0
00:08:47.560 04:52:22 -- common/autotest_common.sh@1382 -- # xtrace_disable
00:08:47.560 04:52:22 -- common/autotest_common.sh@10 -- # set +x
00:08:54.131 INFO: APP EXITING
00:08:54.131 INFO: killing all VMs
00:08:54.131 INFO: killing vhost app
00:08:54.131 INFO: EXIT DONE
00:08:56.700 Waiting for block devices as requested
00:08:56.700 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:56.700 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:57.014 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:57.014 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:57.014 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:57.014 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:57.014 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:57.274 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:57.274 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma
00:08:57.274 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma
00:08:57.533 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma
00:08:57.533 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma
00:08:57.533 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma
00:08:57.793 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma
00:08:57.793 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma
00:08:57.793 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma
00:08:58.052 0000:d8:00.0 (8086 0a54): vfio-pci -> nvme
00:09:01.342 Cleaning
00:09:01.342 Removing: /dev/shm/spdk_tgt_trace.pid3655000
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3652517
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3653787
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3655000
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3655799
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3656117
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3656459
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3656804
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3657136
00:09:01.342 Removing: /var/run/dpdk/spdk_pid3657421
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3657711
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3658035
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3658902
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3662100
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3662409
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3662719
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3662971
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3663546
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3663568
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3664136
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3664406
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3664698
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3664737
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3665015
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3665260
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3665658
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3665944
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3666235
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3666409
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3666631
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3666829
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3666949
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3667218
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3667501
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3667750
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3667944
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3668112
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3668370
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3668639
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3668920
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3669194
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3669475
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3669748
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3669975
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3670137
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3670352
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3670608
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3670901
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3671168
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3671452
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3671724
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3671942
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3672102
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3672326
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3672586
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3672867
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3673141
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3673422
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3673694
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3673933
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3674103
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3674310
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3674559
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3674848
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3675122
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3675407
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3675682
00:09:01.601 Removing: /var/run/dpdk/spdk_pid3675968
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3676122
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3676342
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3676547
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3676831
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3677115
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3677509
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3678069
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3678684
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3679421
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3680145
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3680490
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3681033
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3681465
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3681866
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3682408
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3682790
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3683240
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3683783
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3684091
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3684621
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3685150
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3685450
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3685987
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3686519
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3686840
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3687377
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3687834
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3688216
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3688760
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3689154
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3689592
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3690220
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3690759
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3691176
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3691605
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3692143
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3692693
00:09:01.861 Removing: /var/run/dpdk/spdk_pid3693087
00:09:01.861 Clean
00:09:02.120 killing process with pid 3604361
00:09:06.319 killing process with pid 3604358
00:09:06.319 killing process with pid 3604360
00:09:06.319 killing process with pid 3604359
00:09:06.319 04:52:40 -- common/autotest_common.sh@1446 -- # return 0
00:09:06.319 04:52:40 -- spdk/autotest.sh@374 -- # timing_exit post_cleanup
00:09:06.319 04:52:40 -- common/autotest_common.sh@728 -- # xtrace_disable
00:09:06.319 04:52:40 -- common/autotest_common.sh@10 -- # set +x
00:09:06.319 04:52:40 -- spdk/autotest.sh@376 -- # timing_exit autotest
00:09:06.319 04:52:40 -- common/autotest_common.sh@728 -- # xtrace_disable
00:09:06.319 04:52:40 -- common/autotest_common.sh@10 -- # set +x
00:09:06.319 04:52:40 -- spdk/autotest.sh@377 -- # chmod a+r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:06.319 04:52:40 -- spdk/autotest.sh@379 -- # [[ -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log ]]
00:09:06.319 04:52:40 -- spdk/autotest.sh@379 -- # rm -f /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/udev.log
00:09:06.319 04:52:40 -- spdk/autotest.sh@381 -- # [[ y == y ]]
00:09:06.319 04:52:40 -- spdk/autotest.sh@383 -- # hostname
00:09:06.319 04:52:40 -- spdk/autotest.sh@383 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -c --no-external -d /var/jenkins/workspace/short-fuzz-phy-autotest/spdk -t spdk-wfp-20 -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info
00:09:10.516 geninfo: WARNING: invalid characters removed from testname!
00:09:10.516 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_p2l_upgrade.gcda
00:09:10.516 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_band_upgrade.gcda
00:09:10.516 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/lib/ftl/upgrade/ftl_chunk_upgrade.gcda
00:09:17.099 04:52:52 -- spdk/autotest.sh@384 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:25.221 04:52:58 -- spdk/autotest.sh@385 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:28.510 04:53:03 -- spdk/autotest.sh@389 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:33.785 04:53:08 -- spdk/autotest.sh@390 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:38.057 04:53:13 -- spdk/autotest.sh@391 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:43.331 04:53:17 -- spdk/autotest.sh@392 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh -q -r /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/cov_total.info
00:09:47.526 04:53:22 -- spdk/autotest.sh@393 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:09:47.526 04:53:22 -- common/autotest_common.sh@1689 -- $ [[ y == y ]]
00:09:47.526 04:53:22 -- common/autotest_common.sh@1690 -- $ lcov --version
00:09:47.526 04:53:22 -- common/autotest_common.sh@1690 -- $ awk '{print $NF}'
00:09:47.526 04:53:22 -- common/autotest_common.sh@1690 -- $ lt 1.15 2
00:09:47.526 04:53:22 -- scripts/common.sh@372 -- $ cmp_versions 1.15 '<' 2
00:09:47.526 04:53:22 -- scripts/common.sh@332 -- $ local ver1 ver1_l
00:09:47.526 04:53:22 -- scripts/common.sh@333 -- $ local ver2 ver2_l
00:09:47.526 04:53:22 -- scripts/common.sh@335 -- $ IFS=.-:
00:09:47.526 04:53:22 -- scripts/common.sh@335 -- $ read -ra ver1
00:09:47.526 04:53:22 -- scripts/common.sh@336 -- $ IFS=.-:
00:09:47.526 04:53:22 -- scripts/common.sh@336 -- $ read -ra ver2
00:09:47.526 04:53:22 -- scripts/common.sh@337 -- $ local 'op=<'
00:09:47.526 04:53:22 -- scripts/common.sh@339 -- $ ver1_l=2
00:09:47.526 04:53:22 -- scripts/common.sh@340 -- $ ver2_l=1
00:09:47.526 04:53:22 -- scripts/common.sh@342 -- $ local lt=0 gt=0 eq=0 v
00:09:47.527 04:53:22 -- scripts/common.sh@343 -- $ case "$op" in
00:09:47.527 04:53:22 -- scripts/common.sh@344 -- $ : 1
00:09:47.527 04:53:22 -- scripts/common.sh@363 -- $ (( v = 0 ))
00:09:47.527 04:53:22 -- scripts/common.sh@363 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) ))
00:09:47.527 04:53:22 -- scripts/common.sh@364 -- $ decimal 1
00:09:47.527 04:53:22 -- scripts/common.sh@352 -- $ local d=1
00:09:47.527 04:53:22 -- scripts/common.sh@353 -- $ [[ 1 =~ ^[0-9]+$ ]]
00:09:47.527 04:53:22 -- scripts/common.sh@354 -- $ echo 1
00:09:47.527 04:53:22 -- scripts/common.sh@364 -- $ ver1[v]=1
00:09:47.527 04:53:22 -- scripts/common.sh@365 -- $ decimal 2
00:09:47.527 04:53:22 -- scripts/common.sh@352 -- $ local d=2
00:09:47.527 04:53:22 -- scripts/common.sh@353 -- $ [[ 2 =~ ^[0-9]+$ ]]
00:09:47.527 04:53:22 -- scripts/common.sh@354 -- $ echo 2
00:09:47.527 04:53:22 -- scripts/common.sh@365 -- $ ver2[v]=2
00:09:47.527 04:53:22 -- scripts/common.sh@366 -- $ (( ver1[v] > ver2[v] ))
00:09:47.527 04:53:22 -- scripts/common.sh@367 -- $ (( ver1[v] < ver2[v] ))
00:09:47.527 04:53:22 -- scripts/common.sh@367 -- $ return 0
00:09:47.527 04:53:22 -- common/autotest_common.sh@1691 -- $ lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1'
00:09:47.527 04:53:22 -- common/autotest_common.sh@1703 -- $ export 'LCOV_OPTS=
00:09:47.527 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:47.527 --rc genhtml_branch_coverage=1
00:09:47.527 --rc genhtml_function_coverage=1
00:09:47.527 --rc genhtml_legend=1
00:09:47.527 --rc geninfo_all_blocks=1
00:09:47.527 --rc geninfo_unexecuted_blocks=1
00:09:47.527 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:47.527 '
00:09:47.527 04:53:22 -- common/autotest_common.sh@1703 -- $ LCOV_OPTS='
00:09:47.527 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:47.527 --rc genhtml_branch_coverage=1
00:09:47.527 --rc genhtml_function_coverage=1
00:09:47.527 --rc genhtml_legend=1
00:09:47.527 --rc geninfo_all_blocks=1
00:09:47.527 --rc geninfo_unexecuted_blocks=1
00:09:47.527 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:47.527 '
00:09:47.527 04:53:22 -- common/autotest_common.sh@1704 -- $ export 'LCOV=lcov
00:09:47.527 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:47.527 --rc genhtml_branch_coverage=1
00:09:47.527 --rc genhtml_function_coverage=1
00:09:47.527 --rc genhtml_legend=1
00:09:47.527 --rc geninfo_all_blocks=1
00:09:47.527 --rc geninfo_unexecuted_blocks=1
00:09:47.527 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:47.527 '
00:09:47.527 04:53:22 -- common/autotest_common.sh@1704 -- $ LCOV='lcov
00:09:47.527 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
00:09:47.527 --rc genhtml_branch_coverage=1
00:09:47.527 --rc genhtml_function_coverage=1
00:09:47.527 --rc genhtml_legend=1
00:09:47.527 --rc geninfo_all_blocks=1
00:09:47.527 --rc geninfo_unexecuted_blocks=1
00:09:47.527 --gcov-tool /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/test/fuzz/llvm/llvm-gcov.sh
00:09:47.527 '
00:09:47.527 04:53:22 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/scripts/common.sh
00:09:47.527 04:53:22 -- scripts/common.sh@433 -- $ [[ -e /bin/wpdk_common.sh ]]
00:09:47.527 04:53:22 -- scripts/common.sh@441 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:09:47.527 04:53:22 -- scripts/common.sh@442 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
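The scripts/common.sh xtrace above is the version gate: "lcov --version | awk '{print $NF}'" yields 1.15, and "lt 1.15 2" splits both strings on separators and compares the fields numerically, so the pre-2.0 spelling of the --rc options is kept. A standalone sketch of that comparison for purely numeric dotted versions (a simplification, not the exact cmp_versions helper):

# lt VER1 VER2 - succeed when VER1 is strictly older than VER2.
lt() {
    local IFS=.
    local -a a=($1) b=($2)
    local i
    for ((i = 0; i < ${#a[@]} || i < ${#b[@]}; i++)); do
        (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # first differing field decides
        (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
    done
    return 1   # equal versions are not less-than
}
lt 1.15 2 && echo "lcov is pre-2.0: keep the legacy --rc option names"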
00:09:47.527 04:53:22 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:47.527 04:53:22 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:47.527 04:53:22 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:47.527 04:53:22 -- paths/export.sh@5 -- $ export PATH
00:09:47.527 04:53:22 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:09:47.527 04:53:22 -- common/autobuild_common.sh@439 -- $ out=/var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output
00:09:47.527 04:53:22 -- common/autobuild_common.sh@440 -- $ date +%s
00:09:47.527 04:53:22 -- common/autobuild_common.sh@440 -- $ mktemp -dt spdk_1731038002.XXXXXX
00:09:47.527 04:53:22 -- common/autobuild_common.sh@440 -- $ SPDK_WORKSPACE=/tmp/spdk_1731038002.UC0pNy
00:09:47.527 04:53:22 -- common/autobuild_common.sh@442 -- $ [[ -n '' ]]
00:09:47.527 04:53:22 -- common/autobuild_common.sh@446 -- $ '[' -n '' ']'
00:09:47.527 04:53:22 -- common/autobuild_common.sh@449 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/'
00:09:47.527 04:53:22 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp'
00:09:47.527 04:53:22 -- common/autobuild_common.sh@455 -- $ scanbuild='scan-build -o /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:09:47.527 04:53:22 -- common/autobuild_common.sh@456 -- $ get_config_params
00:09:47.527 04:53:22 -- common/autotest_common.sh@397 -- $ xtrace_disable
00:09:47.527 04:53:22 -- common/autotest_common.sh@10 -- $ set +x
00:09:47.527 04:53:22 -- common/autobuild_common.sh@456 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-coverage --with-ublk --with-vfio-user'
00:09:47.527 04:53:22 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j112
00:09:47.527 04:53:22 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
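Stripped of the repeated --rc flags and the absolute workspace paths, the coverage post-processing traced at autotest.sh@383 through @393 above reduces to a capture, merge, filter pipeline. A condensed sketch (paths shortened for readability; llvm-gcov.sh is the wrapper from spdk/test/fuzz/llvm referenced in the log):

# Capture the counters written during the test run.
lcov -q -c --no-external -d ./spdk --gcov-tool ./llvm-gcov.sh -o cov_test.info
# Merge with the baseline captured before the tests started.
lcov -q -a cov_base.info -a cov_test.info -o cov_total.info
# Successively drop sources that should not count toward SPDK coverage.
lcov -q -r cov_total.info '*/dpdk/*' -o cov_total.info
lcov -q -r cov_total.info '/usr/*' -o cov_total.info
rm -f cov_base.info cov_test.info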
00:09:47.527 04:53:22 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:09:47.527 04:53:22 -- spdk/autopackage.sh@18 -- $ [[ 1 -eq 0 ]]
00:09:47.527 04:53:22 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:09:47.527 04:53:22 -- spdk/autopackage.sh@19 -- $ timing_finish
00:09:47.527 04:53:22 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:09:47.527 04:53:22 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:09:47.527 04:53:22 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/short-fuzz-phy-autotest/spdk/../output/timing.txt
00:09:47.787 04:53:22 -- spdk/autopackage.sh@20 -- $ exit 0
00:09:47.787 + [[ -n 3560424 ]]
00:09:47.787 + sudo kill 3560424
00:09:47.796 [Pipeline] }
00:09:47.811 [Pipeline] // stage
00:09:47.817 [Pipeline] }
00:09:47.831 [Pipeline] // timeout
00:09:47.836 [Pipeline] }
00:09:47.850 [Pipeline] // catchError
00:09:47.855 [Pipeline] }
00:09:47.870 [Pipeline] // wrap
00:09:47.876 [Pipeline] }
00:09:47.889 [Pipeline] // catchError
00:09:47.899 [Pipeline] stage
00:09:47.901 [Pipeline] { (Epilogue)
00:09:47.914 [Pipeline] catchError
00:09:47.916 [Pipeline] {
00:09:47.930 [Pipeline] echo
00:09:47.932 Cleanup processes
00:09:47.938 [Pipeline] sh
00:09:48.225 + sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:48.225 3702503 sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:48.240 [Pipeline] sh
00:09:48.526 ++ sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk
00:09:48.526 ++ grep -v 'sudo pgrep'
00:09:48.526 ++ awk '{print $1}'
00:09:48.526 + sudo kill -9
00:09:48.526 + true
00:09:48.538 [Pipeline] sh
00:09:48.823 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:09:48.823 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:48.823 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:09:50.202 xz: Reduced the number of threads from 112 to 89 to not exceed the memory usage limit of 14,718 MiB
00:10:00.198 [Pipeline] sh
00:10:00.483 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:10:00.483 Artifacts sizes are good
00:10:00.497 [Pipeline] archiveArtifacts
00:10:00.504 Archiving artifacts
00:10:00.643 [Pipeline] sh
00:10:00.952 + sudo chown -R sys_sgci: /var/jenkins/workspace/short-fuzz-phy-autotest
00:10:00.967 [Pipeline] cleanWs
00:10:00.977 [WS-CLEANUP] Deleting project workspace...
00:10:00.977 [WS-CLEANUP] Deferred wipeout is used...
00:10:00.984 [WS-CLEANUP] done
00:10:00.986 [Pipeline] }
00:10:01.004 [Pipeline] // catchError
00:10:01.017 [Pipeline] sh
00:10:01.301 + logger -p user.info -t JENKINS-CI
00:10:01.310 [Pipeline] }
00:10:01.323 [Pipeline] // stage
00:10:01.328 [Pipeline] }
00:10:01.343 [Pipeline] // node
00:10:01.348 [Pipeline] End of Pipeline
00:10:01.385 Finished: SUCCESS
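The "Cleanup processes" step in the epilogue above is a pgrep chain: list every process still holding the workspace path, drop the pgrep itself from the listing, and force-kill whatever remains. The idiom as traced, collected into variables for clarity (the trailing true keeps the step green when nothing is left, since kill -9 with an empty argument list exits nonzero):

# Collect PIDs of leftover test processes, excluding this pgrep itself.
pids=$(sudo pgrep -af /var/jenkins/workspace/short-fuzz-phy-autotest/spdk \
    | grep -v 'sudo pgrep' | awk '{print $1}')
# Force-kill the stragglers; tolerate the empty-list failure case.
sudo kill -9 $pids || true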